Policies that outline what is and isn't allowed on the Facebook app.
Policies for ad content and business assets.
Other policies that apply to Meta technologies.
How we update our policies, measure results, work with others and more.
How we help prevent interference, empower people to vote and more.
How we work with independent fact-checkers and others to identify and take action on misinformation.
How we assess content for newsworthiness.
How we reduce problematic content in News Feed.
Quarterly report on how well we're doing at enforcing our policies in the Facebook app and on Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional restrictions that limit people's ability to access the Internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
Download current and past regulatory reports for Facebook and Instagram.
When requested by a user, government, law enforcement or external child safety experts, we may remove content created to identify a private minor if it could put the minor's safety at risk.
See examples of what enforcement looks like for people on Facebook, such as reporting something that you don't think should be on Facebook, being told that you've violated our Community Standards, and seeing a warning screen over certain content.
Note: We're always improving, so what you see here may be slightly outdated compared to what we currently use.
We offer an option to report content, whether it's a post, a comment, a story, a message or something else.
We help people report things that they don't think should be on our platform.
We ask people to tell us more about what's wrong. This helps us send the report to the right place.
After these steps, the report is submitted, and we lay out what people should expect next.
After we've reviewed the report, we'll send the reporting user a notification.
We'll share more details about our review decision in the Support Inbox. We'll notify people that this information is there and send them a link to it.
If people think we made the wrong decision, they can request another review.
We'll send a final response after we've re-reviewed the content, again to the Support Inbox.
When someone posts something that violates our Community Standards, we'll tell them.
We'll also address common misconceptions about enforcement.
We'll give people easy-to-understand explanations about why their content was removed.
After we've established the context for our decision and explained our policy, we'll ask people what they'd like to do next, including letting us know if they think we made a mistake.
If people disagree with the decision, we'll ask them to tell us more.
Here, we set expectations on what will happen next.
Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.
Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.
Learn what you can do if you see something on Facebook that goes against our Community Standards.