Oversight Board cases

Updated: Jun 2, 2021

The Oversight Board can hear content cases referred directly by Facebook or brought by people on the Facebook app or Instagram who disagree with Facebook’s decisions. With the exception of cases Facebook refers for Expedited Review, the board ultimately selects the cases it wants to hear.

All selected cases

Case title | Selection date | Region | Violation type | Status
Brazilian state-level health authority’s post about COVID lockdowns | 2 Jun 2021 | Latin America and Caribbean | Not applicable | Pending decision
Situation in Myanmar while using profanity | 20 May 2021 | Central and South Asia | Hate Speech | Pending decision
Support of Abdullah Öcalan, founder of the PKK | 20 Apr 2021 | Central and South Asia | Dangerous Individuals and Organizations | Pending decision
Armenian people and the Armenian Genocide | 2 Mar 2021 | Central and South Asia | Hate Speech | Decision issued
January 2021 protests in Russia | 2 Mar 2021 | Europe | Bullying and Harassment | Decision issued
Punjabi concern over the RSS in India | 9 Feb 2021 | Central and South Asia | Bullying and Harassment | Decision issued
Depiction of Zwarte Piet | 29 Jan 2021 | Europe | Hate Speech | Decision issued
Former President Trump’s suspension from Facebook | 21 Jan 2021 | United States and Canada | Not applicable | Decision issued
Veiled threat based on religious beliefs | 3 Dec 2020 | Central and South Asia | Violence and Incitement | Decision issued
Hydroxychloroquine, Azithromycin and COVID-19 | 1 Dec 2020 | Europe | Violence and Incitement | Decision issued
Nazi quote | 1 Dec 2020 | Not applicable | Dangerous Individuals and Organizations | Decision issued
Violence against French people | 1 Dec 2020 | Central and South Asia | Hate Speech | Case withdrawn
Uyghur Muslims | 1 Dec 2020 | Asia-Pacific and Oceania | Hate Speech | Decision issued
Armenians in Azerbaijan | 1 Dec 2020 | Central and South Asia | Hate Speech | Decision issued
Breast cancer symptoms and nudity | 1 Dec 2020 | Latin America and Caribbean | Adult Nudity and Sexual Activity | Decision issued

How the board's decisions impact other content

Facebook will implement the Oversight Board’s decision across identical content with parallel context, where such content exists and when it is technically and operationally possible to do so.

We’ll first use the decision to determine what constitutes identical content, such as another post using the same image that the board decided should be removed. We’ll then use the decision to define parallel context, for example whether the image is shared with the same sentiment. Part of this process involves reassessing the case content’s scope and severity based on the board’s decision.

Our Operations team will then investigate how the decision can be enforced consistently against other content on the Facebook app or Instagram. There are some limitations on removing seemingly identical content, including when content is similar but not similar enough for our systems to identify it. Through this process, our Operations team ensures that the board’s decision on one piece of content is reflected across other content on Facebook or Instagram.