How we help prevent interference, empower people to vote and more.
How we work with independent fact-checkers, and more, to identify and take action on misinformation.
How we assess content for newsworthiness.
How we reduce problematic content in News Feed.
Quarterly report on how well we're doing at enforcing our policies on the Facebook app and Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional internet disruptions that limit people's ability to access the internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
On December 3, 2020, the Oversight Board selected a case referred by Facebook regarding a post in a group that appears to exist for Muslims in India. The post contains a statement about a sword being taken from its scabbard if people speak against the prophet. The post also references President Emmanuel Macron of France. Facebook deemed this post a veiled threat and removed it for violating our policy on violence and incitement, as laid out in the Facebook Community Standards.
Facebook referred this case to the board as an example of a challenging decision about statements that may incite violence even when the threat is not explicit. The case also highlights an important tension we face when addressing religious speech that could be interpreted as a threat of violence.
On February 12, 2021, the board overturned Facebook's decision on this case. Facebook acted to comply with the board’s decision, and this content has been reinstated.
On March 11, 2021, Facebook responded to the board’s recommendation for this case. We are committing to take action on the recommendation.
Provide people with additional information regarding the scope and enforcement of restrictions on veiled threats, which would help people understand what content is allowed in this area. Facebook should make its enforcement criteria public. These criteria should account for the intent and identity of the person posting, as well as their audience and the wider context.