JUN 23, 2021
Facebook’s review teams are trained to make content decisions that are accurate and consistent, based on the policies outlined in the Facebook Community Standards and Instagram Community Guidelines. This is especially important when potentially violating content is widely shared on Facebook or Instagram: given the number of people who could see it, we endeavor to make the right decision.
In these instances, we may employ additional reviews for high-visibility content that may violate our policies—for example, reporting from a war zone with graphic imagery that a closely-followed news source shares. This process, which we refer to as cross-check, means that our review teams will assess this content multiple times.
These additional reviews are a supplemental safeguard to ensure we’re accurately taking action on potentially violating content that more people see. They also help us verify that when content violates our policies, including content from public figures or popular Pages, we consistently remove it.