2024-002-FB-UA, 2024-003-FB-UA
Today, February 8, 2024, the Oversight Board selected a case bundle appealed by Facebook users concerning two posts regarding the electoral process in Australia during the 2023 Australian Indigenous Voice referendum.
The first piece of content contains a screenshot of tweets from the Australian Electoral Commission discussing the issue of individuals voting more than once. The accompanying caption states, “So it is official. Go out, vote early, vote often, and vote NO.” The second piece of content shares a screenshot of one of the same tweets from the Australian Electoral Commission with a text overlay stating “[t]hey are setting us up for a ‘Rigging’.. smash the voting centres people it’s a NO, NO, NO, NO, NO.”
Meta took down both pieces of content for violating our policy on Coordinating Harm and Promoting Crime, as laid out in the Facebook Community Standards. In the second piece of content, the phrase “smash the voting centres” was removed under that policy because it advocates inundating the election with duplicate votes. The same phrase can also be read as a call to physically destroy the voting centres, which violates our policy on Violence and Incitement, as laid out in the Facebook Community Standards.
In accordance with our policy on Coordinating Harm and Promoting Crime, Meta prohibits “facilitating, organizing, or admitting to certain criminal or harmful activities,” including “advocating, providing instructions for, or demonstrating explicit intent to illegally participate in the voting process.” Additionally, Meta does not allow threats of violence against a place if they could “lead to death or serious injury of any person that could be present at the targeted place.”
We will implement the Board’s decision once it has finished deliberating and will update this post accordingly. Please see the Board’s website for the decision when it is issued.