Policies that outline what is and isn't allowed on the Facebook app.
Policies for ad content and business assets.
Other policies that apply to Meta technologies.
How we update our policies, measure results, work with others and more.
How we help prevent interference, empower people to vote and more.
How we work with independent fact-checkers, among other measures, to identify and take action on misinformation.
How we assess content for newsworthiness.
How we reduce problematic content in News Feed.
Quarterly report on how well we're doing at enforcing our policies in the Facebook app and on Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional restrictions that limit people's ability to access the Internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
Download current and past regulatory reports for Facebook and Instagram.
As the world changes, so do our Community Standards. Our Content Policy team consults with external stakeholders from around the globe to discuss potential updates.
Doing something is a good start, but it isn't enough. That's why we report enforcement metrics publicly so people can understand and track how we're doing over time.
We work with other social platforms and law enforcement to prevent harm online and offline.
We talk to stakeholders such as civil society organisations, activist groups, thought leaders and academics to draw on their expertise and gather feedback as we develop our Community Standards. Our Content Policy team integrates this feedback to help ensure inclusiveness, expertise and transparency throughout the policy development process.
As our technology improves, we can find and review the highest-severity content faster. These improvements help our review teams prevent more harm, more quickly.