Meta is increasing enforcement of its policies against violent posts and misinformation amid the Israel-Hamas war as charged images and posts balloon. The company, along with Elon Musk's X (formerly Twitter) and other social media platforms, has faced pressure from Europe to remain vigilant on misinformation throughout the conflict, in light of European Union law that requires them to monitor and remove illegal content from their sites.
Meta said in a blog post Friday that it has created a special operations center staffed with experts fluent in Hebrew and Arabic and has "removed or marked as disturbing" more than 795,000 pieces of content since the start of the war on Oct. 7. The company says Hamas, which is designated a terrorist organization by the United States and other Western countries, is banned from its sites.
Meta said it is also taking temporary measures such as blocking certain hashtags and prioritizing Facebook and Instagram Live reports related to the crisis. The company is temporarily expanding its violence and incitement policy as well, which can mean removing authentic content that identifies hostages taken by Hamas, even when the intent is to condemn the kidnappings or raise awareness of the hostages' plight.
The company says that since the start of the conflict, it has removed seven times as many pieces of content daily for violating its Dangerous Organizations and Individuals policy in Hebrew and Arabic, compared with the two months prior. It is also removing seven times as many posts daily that violate its violence and incitement policy in those languages, compared with the same period last year.
Despite the efforts by Meta and other companies to keep harmful or disturbing content off their sites, some users are turning their social media accounts into platforms for antisemitism and other forms of hate. New York Attorney General Letitia James has written to Google, X, TikTok, Reddit, and Rumble requesting information about their efforts to identify and remove disinformation that could fuel hate crimes and other violence in the wake of the Israel-Hamas conflict.
In an open letter, James writes that the companies "have been criticized for failing to remove antisemitic and Islamophobic content, but I believe these efforts do not go far enough." She asks the companies to commit to removing "violent and dangerous content" within 24 hours and to provide more detail about how they are addressing the crisis, warning that if they do not act, she will pursue civil and criminal charges in state court for violations of New York law. She has also asked the companies for specific examples of content they have identified, removed, or blocked, including the dates it was posted and where the material originated.

Separately, Alphabet CEO Sundar Pichai emphasized in a letter that the company has an office in Israel and supports the country, urging employees to stay safe and stand together against antisemitism. The letter responded to one Pichai received from Thierry Breton, the European Union's commissioner for the internal market.