
Meta’s oversight board takes on the Israel-Hamas war

Casey Newton exposes a critical paradox in digital governance: Meta's Oversight Board is racing to adjudicate war crimes in real time, only to find the company has already rewritten the rules to make its own intervention unnecessary. This piece is essential reading because it reveals that the true crisis isn't just about which videos stay up, but that automated systems and understaffed human teams are failing to distinguish between terrorist propaganda and the raw, horrific testimony of civilians caught in the crossfire.

The Speed of Crisis vs. The Slowness of Justice

Newton frames the Oversight Board's recent move as a desperate attempt to prove relevance during an unfolding catastrophe. The board selected two cases for an expedited review, a drastic departure from its usual three-month timeline. "For the first time, the board said it would conduct its review on an expedited basis — meaning that its decision could come in as soon as 48 hours, and up to 30 days," Newton notes. This acceleration is significant, yet it highlights a systemic failure where the company's reactive measures outpace the very body designed to check them.


The two cases selected illustrate the impossible tightrope Meta walks. One involved a graphic video of the aftermath of a strike on Al-Shifa Hospital, initially removed for violating rules on violent content. The other showed a hostage being taken by Hamas militants, removed under policies prohibiting the depiction of terrorist incidents. Newton points out the stark contradiction in Meta's enforcement: "The removal of the first post, then, speaks to anxieties that Meta is acting too aggressively in restricting speech by Palestinians who are speaking out about the terrible impact of the war on civilians. The removal of the second post reflects the opposite anxiety: that Meta is acting too aggressively to silence Israelis who are speaking out about the atrocious October 7 attacks and their aftermath."

This duality suggests that the platform's moderation is not a neutral application of rules, but a chaotic reaction to political pressure and algorithmic overreach. The human cost here is invisible to the code; a child's death or a hostage's plea becomes merely a data point to be flagged and deleted.

In any case, neither removal would stand. Meta restored the first post to Instagram, behind a warning screen, as soon as the board told the company it was selecting the case for appeal.

Newton observes that the mere announcement of the board's involvement triggered immediate reversals. This creates a strange dynamic where the board's authority is undermined by the company's pre-emptive capitulation. "If you believe that Meta erred in its initial decisions in the above cases, you should be glad that the board intervened and got them restored to Facebook and Instagram within a few weeks," Newton writes. However, he immediately pivots to the darker implication: "At the same time, Meta's quick action in response to the board could have the odd effect of making the board's expedited review moot."

The Illusion of Policy and the Reality of Failure

The core of Newton's argument is that Meta is confusing an enforcement failure with a policy problem. The company claims its rules are nuanced enough to allow graphic content in the context of human rights abuses. "In the context of discussions about important issues such as human rights abuses, armed conflicts or acts of terrorism, we allow graphic content (with some limitations) to help people to condemn and raise awareness about these situations," Newton quotes from Meta's policy. Yet, the reality on the ground is a flood of errors.

Newton speculates that these errors stem from the sheer volume of content and the trauma inflicted on the moderators themselves. "It seems likely that one of the company's contracted content moderators — or automated systems — made a mistake. It's a deeply unsatisfying answer, particularly given the high-stakes nature of the error." The board has seen appeals triple since October 7, a statistic that Newton argues is a direct result of the company's "layoff-heavy 'year of efficiency'" which slashed moderation teams. Critics might argue that Meta has simply not invested enough in the infrastructure required to handle a global conflict, but Newton suggests the issue is deeper: the design of the system itself.

The designation of Hamas as a terrorist organization creates a structural bias. "Determining which posts in a war zone are coming directly from Hamas, and which are coming from average Palestinians, is difficult, nuanced work," Newton explains. When a platform is hyper-sensitive to accusations of hosting terrorist propaganda, the default setting becomes over-enforcement against the entire population associated with that group. "Moderators being asked to make judgment calls dozens or even hundreds of times a day are bound to make mistakes — and Hamas' status as a terrorist organization all but ensures that over-enforcement of rules against Palestinians will continue."

There is no number of cases the board could take, or speed with which it could adjudicate them, that will alter that basic dynamic.

This is the piece's most sobering insight. The Oversight Board, often criticized for being too slow, is now moving at the speed of light, yet it cannot fix a broken machine. Newton argues that the board risks "mistake[ing] an enforcement issue for a policy problem." The solution isn't just better rules; it's a fundamental re-evaluation of how a profit-driven algorithm handles the suffering of civilians in a war zone.

Bottom Line

Newton's strongest contribution is his refusal to let Meta off the hook with a simple "policy update" narrative, exposing instead a systemic inability to scale human judgment during a humanitarian crisis. The argument's vulnerability lies in the fact that without a radical shift in how platforms prioritize human rights over liability, no amount of board oversight can stop the erasure of Palestinian voices. Readers should watch to see if the board moves beyond restoring individual posts to demanding structural changes in how Meta's automated systems are trained to recognize the difference between a terrorist and a victim.

Sources

Meta’s oversight board takes on the Israel-Hamas war

by Casey Newton · Platformer

Today let’s talk about the Oversight Board’s move to weigh in on the ongoing controversy about how Meta is moderating content related to the war between Israel and Hamas. The board is making an effort to show that it can prove useful during an unfolding crisis. But given the speed and scale of the conflict — and Meta’s cool reception to many of the board’s recent ideas — it’s unclear whether the result will go much beyond restoring a couple of posts to Facebook and Instagram.

On Thursday the Oversight Board, a Meta-funded but independent body that is empowered to make binding decisions about whether posts on the company’s apps should come down or stay up, announced it would take two cases stemming from the Israel-Hamas war. For the first time, the board said it would conduct its review on an expedited basis — meaning that its decision could come in as soon as 48 hours, and up to 30 days. (Regular decisions typically take the board about three months.) 

The first case selected today concerns an Instagram post:

[It] includes a video showing what appears to be the aftermath of a strike on a yard outside Al-Shifa Hospital in Gaza City. The content, which was posted on Instagram in early November, shows people, including children, injured or dead, lying on the ground and/or crying. A caption in Arabic and English below the video states that the hospital has been targeted by the “usurping occupation,” a reference to the Israeli army, and tags human rights and news organizations. Meta initially removed the post for violating its rules on violent and graphic content. 

The second is a Facebook post:

[It] shows a woman begging her kidnappers not to kill her as she is taken hostage and driven away on a motorbike. The woman is seen sitting on the back of the motorbike, reaching out and pleading for her life. The video then shows a man, who appears to be another hostage, being marched away by captors. In a caption, the user who posted the content describes the kidnappers as Hamas militants and urges people to watch the video to gain a “deeper understanding” of the horror that Israel woke up to on October 7, 2023. The user posted the content around a week after the October 7 attacks. Under its Dangerous Organizations and Individuals policy, Meta has designated Hamas as a Tier 1 ...