Automated tools used by Facebook parent company Meta to police potentially harmful content unnecessarily removed two videos related to the Israel-Hamas war, Meta's Oversight Board said in a statement on Tuesday. The board warned that the moderation technology could prevent users from viewing content related to human suffering on both sides of the conflict.
The findings, the result of the oversight board's first "rapid review", highlight the intense scrutiny social media companies face over their handling of conflict-related content.
The board overturned Meta's initial decision to remove two pieces of content. As part of its decision, the group urged Meta to respect users' rights to "free speech... and their ability to communicate during this crisis."
"The committee is focused on protecting the free speech rights of people on all sides about these horrific events while ensuring that no testimony incites violence or hatred," committee co-chair Michael McConnell said in a statement. statement. "These testimonies are important not only for speakers, but also for users around the world who are seeking timely and diverse information about groundbreaking events."
In response to the board's decision, Meta said that since both items had been reinstated prior to the board's decision, no further action would be taken. "Expression and safety are important to us and to the people who use our services," the company said in a blog post.
Meta’s Oversight Board is a body of experts in areas such as free speech and human rights. It is often described as Meta's Supreme Court because it allows users to appeal content decisions on the company's platforms. The board rules on how the company should handle certain content moderation decisions and also issues broader policy recommendations.
The board said earlier this month that it had decided to review the cases on an expedited basis because decisions on war-related content could have "urgent real-world consequences." The board said the average daily number of user complaints about content decisions "relevant to the Middle East and North Africa region" nearly tripled in the weeks after the conflict between Israel and Hamas began.
Meta told CNN in October that it had established "a special operations center staffed by experts, including those fluent in Hebrew and Arabic, to closely monitor and respond to this rapidly evolving regional situation and is coordinating with third-party fact-checkers."
The oversight board said on Tuesday that after the conflict broke out, Meta took interim steps to address potentially dangerous content, including lowering the threshold for automatic removal of content that may violate its hate speech, violence and incitement, and bullying and harassment policies.
"In other words, Meta is more aggressively using its automated tools to remove potentially banned content," the board said, adding that the company took these steps to prioritize security, but that the move also "increases the likelihood that Meta will mistakenly remove non-non-non-non-content." As of December 11, Meta had not returned its automated systems' content moderation thresholds to normal levels, the board said.
The board's review looked at two pieces of content: a video posted on Instagram that appeared to show the aftermath of a strike outside Al-Shifa Hospital in Gaza City, and another video posted on Facebook that showed two hostages being kidnapped by Hamas militants.
The first video appeared to show "people, including injured or dead children, lying on the ground and/or crying." The board said captions in Arabic and English below the video referred to Israeli forces and said the hospital had been targeted by the "usurping occupation."
Meta's automated systems initially removed the post for violating its rules on graphic and violent content. A user appealed the decision, asking for the video to be reinstated, but Meta's systems automatically denied the request after determining with a "high degree of confidence" that the content violated its rules. After the board agreed to take up the case, Meta reinstated the video with a warning screen flagging its disturbing content; the screen also barred minors from viewing the video and prevented it from being recommended to adult users.
The oversight board said on Tuesday that the video should never have been removed in the first place and criticized Meta's move to restrict its circulation, saying it was "inconsistent with the company's responsibility to respect free speech."
A second video reviewed by the board showed a woman and a man on a motorcycle being taken away by Hamas militants, with a caption urging people to watch it to gain "a deeper understanding" of the Oct. 7 attack on Israel.
Meta initially removed the post for violating its Dangerous Organizations and Individuals policy, which prohibits posting images of terrorist attacks in which victims are visible, even when shared to raise awareness of such attacks. (Meta designates Hamas as a dangerous organization under its policies and has labeled the Oct. 7 assault a terrorist attack.)
After the board took up the case, the company reinstated the video with a warning screen as part of a broader move to allow limited exceptions to its Dangerous Organizations and Individuals policy for content intended to condemn, raise awareness of, or report on kidnappings, or to call for the release of hostages. The warning screen prevents minors from viewing the video and stops it from being recommended to other Facebook users.
As was the case with the first video, the board said the content should not have been removed, adding that excluding the video from recommendations put Meta in breach of its human rights responsibilities.
Meta said on Tuesday it would not change the recommendation limits on the two videos reviewed by the board, because while the board disagreed with those limits, it made no formal recommendation on what to do with the videos.
“The board considers that excluding content that raises awareness of potential human rights violations, conflicts, or acts of terrorism from recommendations is not a necessary or proportionate restriction on freedom of expression, given the high public interest in such content,” the board said in its decision.