Background information about our cross-check system can be found in our Transparency Center.
In this policy advisory opinion referral, Meta is asking for guidance on the following questions:
Given the complexities of content moderation at scale, how should Meta balance its desire to apply its Community Standards fairly and objectively with its need for flexibility, nuance, and context-specific decisions within cross-check?
What improvements should Meta make to how we govern our Early Response (“ER”) Secondary Review cross-check system to fairly enforce our Community Standards while minimizing the potential for over-enforcement, retaining business flexibility, and promoting transparency in the review process?
What criteria should Meta use to determine who is included in ER Secondary Review and how inclusion is prioritized, as one of many factors, by our cross-check ranker?
Once the board has finished deliberating, we will consider and publicly respond to its recommendations within 30 days, and we will update this post accordingly. Please see the board’s website for the recommendations once it issues them.