Policies that outline what is and isn't allowed on the Facebook app.
Policies for ad content and business assets.
Other policies that apply to Meta technologies.
How we update our policies, measure results, work with others, and more.
How we help prevent election interference, empower people to vote, and more.
How we work with independent fact-checkers and others to identify and take action on misinformation.
How we assess content for newsworthiness.
How we reduce problematic content in News Feed.
Quarterly report on how well we're doing at enforcing our policies on the Facebook app and Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional internet disruptions that limit people's ability to access the internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
Download current and past regulatory reports for Facebook and Instagram.
Meta takes a three-part approach to content enforcement on Facebook and Instagram: remove, reduce, and inform.
The Facebook Community Standards outline what is and isn't allowed on Facebook. The Instagram Community Guidelines outline what is and isn't allowed on Instagram. We remove content that goes against our policies as soon as we become aware of it.
Some problematic content can create a negative experience for people on Facebook and Instagram. We'll often reduce the distribution of this content, even when it doesn't quite meet the standard for removal under our policies.
When content is potentially sensitive or misleading, we sometimes add a warning or share additional information from independent fact-checkers.