Civil society organizations are critical partners in helping Meta better understand the impact of our platforms and the context of the diverse communities in which we operate around the globe.
In recognition of this expertise, we regularly engage with civil society organizations around the world (which may be global, regional, or local in scope) as part of our policy development process. In addition, we partner with a subset of civil society organizations through our Trusted Partner program to foster deeper collaboration with specific local organizations and to strengthen their social media monitoring capacity. The Trusted Partner program is a key part of our efforts to improve our policies, enforcement processes, and products, to help keep users safe on our platforms.
Our network of Trusted Partners includes over 400 non-governmental organizations, humanitarian agencies, human rights defenders, and researchers from 113 countries around the globe. From local organizations such as Tech4Peace in Iraq and Defy Hate Now in South Sudan to international organizations like Internews, our partners bring a wealth of knowledge and experience to help inform our content moderation efforts. We partner with expert organizations that represent the voices and experiences of marginalized users around the globe and are equipped to raise questions and concerns about content on Facebook and Instagram in order to:
Address problematic content trends and prevent harm
Foster online safety and security
Inform the development of effective and transparent policies
In addition to reporting content, Trusted Partners provide crucial feedback on our content policies and enforcement to help ensure that our efforts keep users safe. Their subject-matter and regional expertise strengthens our policies by enabling us to weigh a range of perspectives in our content moderation efforts. For example, we consult with Trusted Partners to understand, at the local level, the conditions and contexts in which certain forms of misinformation may contribute to a risk of imminent violence or physical harm, escalate social tensions, trigger violence, or undermine democratic processes. These consultations enable us to develop tailored policies that help keep our community safe during times of crisis.
In selecting our Trusted Partners, we seek organizations that have experience in social media monitoring, an interest in learning about our content policies, and a commitment to keeping online communities safe.
We are grateful for the partnerships we have with expert civil society organizations that help us to better understand local context, trends in speech, and signals of imminent harm.