Oversight Board Selects a Case Regarding a Post Discussing the Situation in Ethiopia

UPDATED

JUN 12, 2023

2021-014-FB-UA

Today, the Oversight Board selected a case appealed by a Facebook user regarding a post that criticizes the current situation in Raya Kobo, Ethiopia, and accuses the Tigray People’s Liberation Front (TPLF) and ethnic Tigrayan civilians of various violent crimes.

Facebook initially took down this content for violating our policy on hate speech, as laid out in the Facebook Community Standards. However, upon further review, we determined that we had removed this content in error and reinstated it.

We will implement the board’s decision once it has finished deliberating and will update this post accordingly. Please see the board’s website for the decision when it is issued.

Case decision

We welcome the Oversight Board’s decision today on this case. Meta has acted immediately to comply with the board’s decision, and this content has been removed.

In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly. For more information, please see our Newsroom post about how we implement the board’s decisions.

After reviewing the recommendations the board provided alongside its decision, we will update this post.

Recommendations

According to the Oversight Board’s bylaws, Meta has 30 days to respond to the board’s recommendations. The Oversight Board issued its decision on December 14, 2021, and today, January 13, 2022, we are publishing our detailed response, which can be found below.

Meta is committed to protecting our community by removing any content from our services that could contribute to a risk of real-world harm. Meta’s policies already prohibit this content on the platform, and we have invested significant resources in Ethiopia to identify and remove potentially harmful content. We have been working for years to understand and address hate speech in countries at risk of conflict, remove coordinated inauthentic behavior, and combat misinformation.

In its decision, the Oversight Board determined that Meta should have removed a post in Ethiopia claiming that some civilians were assisting the Tigray People’s Liberation Front in committing atrocities against other civilians. The board held that, based on the lack of corroboration for the person’s claim in the post, Meta should have determined that the post constituted an "unverified rumor" that significantly increased the risk of imminent violence, and should have removed it. Meta respectfully disagrees with the reasoning of the board’s decision because it would impose a journalistic publishing standard on people that could prevent them from raising awareness of atrocities or other hazards in conflict situations where real-time verification is unlikely. The ability to share and access this type of information can be critical to the safety of the community.

Notwithstanding our disagreement with the board’s decision, we welcome its oversight role. As we noted above when the board issued its decision, we removed this content at the board’s direction. In response to the board’s three recommendations in this case, we are implementing one recommendation in part, assessing the feasibility of implementing another, and will have no further updates on the third.

Recommendation 1 (implementing in part)

Meta should rewrite Meta’s value of “safety” to reflect that online speech may pose risk to the physical security of persons and the right to life, in addition to the risks of intimidation, exclusion and silencing.

Our commitment: In response to the board’s recommendation, we will update our value of safety in our Community Standards as follows:

"We’re committed to making Facebook a safe place. We remove content that could contribute to a risk of harm to the physical security of persons. Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook."

Considerations: As part of our commitment to protecting our community, Meta removes content from our services that could contribute to a risk of real-world harm. Our goal is to ensure that both expression and personal safety are protected and respected on our platforms. This principle has long been reflected in our policies, and we will update our value of safety to reflect it as well in response to the board’s recommendation.

Next steps: We will update our value of safety to reflect the board’s recommendation and expect to publish this update by the end of this month. We will continue to adjust our policies and values as needed to mitigate any risk of contributing to offline harm. We continue to welcome guidance from the board to help us in these efforts. We will provide an update on our implementation of this recommendation in a future Quarterly Update.

Recommendation 2 (no further action)

Facebook’s Community Standards should reflect that in contexts of war and violent conflict, unverified rumors pose a higher risk to the rights to life and security of persons. This should be reflected at all levels of the moderation process.

Our commitment: We will continue to work with trusted partners and independent fact-checkers to identify and remove misinformation that may contribute to the risk of imminent harm, while protecting people’s ability to report on events in real time and to receive information.

Considerations: We agree with the board that greater risks to the rights to life and security of persons exist in contexts of war and violent conflict. We also recognize that in these high-risk areas, real-time reports of violence or other information can play a critical role in safety and raise global awareness, especially when journalists cannot access the area due to the ongoing conflict. To balance the critical need to protect people’s voice with their safety, we invest significant resources in safety and security measures for at-risk countries. This includes Ethiopia, which has been one of our highest priorities for country-specific interventions, given the longstanding risks of conflict.

Those safety efforts take into account context and involve (among other things) identifying and removing persistently harmful false claims, improving hate speech enforcement, and expanding our policies on coordinating harm, bullying and harassment, and veiled threats. Each of these actions is highly relevant to contexts of armed conflict. To the extent the board’s recommendation suggests that we take into account the heightened risk associated with the context of war and conflict, this is work we already do.

We updated our policy to address unverifiable rumors in 2019 after carefully considering input from 49 experts globally, including academics, human rights experts and civil society organizations. As stakeholders suggested, we have worked to identify and limit the spread of unverifiable rumors that could contribute to a risk of harm while strengthening work with local partners to understand critical context. We further developed this work with trusted partners in conflict zones, such as Myanmar, Ethiopia and the Sahel, to identify persistent claims that, if false, are likely to contribute to the risk of imminent physical harm. Identifying such persistently harmful claims speeds up our removal of potentially harmful misinformation.

Our policy, as informed by our stakeholder engagement, intentionally addresses “unverifiable” as opposed to “unverified” rumors. We do not agree that the appropriate way to balance voice and safety is to remove more reports from conflict zones as “unverified rumors” when we have no signal from a trusted partner or third-party fact-checker that those reports are false or could contribute to a risk of harm. We do not remove merely “unverified” rumors, claims or information, given that these may be accurate statements of personal experience or observation. We remove “unverifiable rumors,” which we define as rumors that cannot be confirmed or debunked in a meaningful timeframe, when we have the necessary information or context suggesting they are likely to contribute to a risk of imminent physical harm. Especially in contexts of war and violent conflict, it is often not possible to verify information quickly. Removing everything that is unverified could lead to the removal of accurate claims by observers or victims of crimes against vulnerable people.

Meta is concerned that removing content based on its own judgment about the level of evidentiary support in someone’s posts would lead to arbitrary results and would suppress potentially accurate reports that could protect others and raise awareness of atrocities. The board’s recommendation would impose a journalistic publishing standard on people that could prevent them from raising awareness of atrocities or other hazards in conflict situations where real-time verification is unlikely. We will continue to rely on trusted partners with local knowledge to inform us when information is false or unverifiable and may lead to imminent harm.

Next steps: We will have no further updates on this recommendation.

Recommendation 3 (assessing feasibility)

Meta should commission an independent human rights due diligence assessment related to Meta’s work in Ethiopia.

Our commitment: Meta has conducted multiple forms of human rights due diligence related to Ethiopia, and we are continually adjusting and improving our policies and mitigations to address real-world issues. We are committed to knowing, mitigating and preventing salient human rights risks. Not all elements of the board’s specific recommendation may be feasible in terms of timing, data science or approach. We will work to update and share insights and actions from our due diligence that align with the board’s goals, our Human Rights Policy and the UN Guiding Principles on Business and Human Rights (“UNGPs”).

Considerations: Human rights due diligence projects can be highly time-intensive, often running a year or more. Methodologies are largely qualitative, and rights holders in conflict zones may have security or other concerns that inhibit their participation. Meta is committed to good-practice human rights due diligence. We will assess the feasibility of a focused human rights due diligence project that honors the board’s intent; enables us to address emerging harms; and is well aligned with the UNGPs, including UNGP Principle 21 on public communications.

Next steps: We will continue existing human rights due diligence and dynamic risk management processes and assess the feasibility of a related due diligence project. We anticipate providing an update within the next few months.