Bringing Local Context to Our Global Standards

UPDATED

JAN 28, 2022

Civil society organizations are critical partners in helping Meta better understand the impact of our platforms and the context of the diverse communities in which we operate around the globe. We built the Trusted Partner program to foster this partnership with civil society, strengthen the social media monitoring capacity of local organizations, and improve our policies, enforcement processes, and products to help keep users safe on our platforms. Our network of Trusted Partners includes over 400 non-governmental organizations, humanitarian agencies, human rights defenders, and researchers from 113 countries. From local organizations such as Tech4Peace in Iraq and Defy Hate Now in South Sudan to international organizations like Internews, our partners bring a wealth of knowledge and experience to help inform our content moderation efforts. Meta provides Trusted Partners with funding to support our shared goals of keeping harmful content off our platforms and helping to prevent offline harm.

We partner with expert organizations that represent the voices and experiences of at-risk users around the globe and are equipped to raise questions and concerns about content on Facebook and Instagram to:

  • Address problematic content trends and prevent harm

  • Foster online safety and security

  • Inform the development of effective and transparent policies

In addition to reporting content, Trusted Partners provide crucial feedback on our content policies and enforcement to help ensure that our efforts keep users safe. Our Trusted Partners' subject matter and regional expertise strengthen our policies by enabling us to consider a range of perspectives in our content moderation efforts. For example, we consult with Trusted Partners to understand, at the local level, the conditions and contexts in which certain forms of misinformation may contribute to the risk of imminent physical harm, escalate social tensions, trigger violence, or undermine democratic processes. These consultations enable us to develop tailored policies that help keep our community safe during times of crisis.

In selecting our Trusted Partners, we seek organizations that have experience in social media monitoring, are interested in learning about our content policies, demonstrate a commitment to keeping online communities safe, and represent marginalized groups who are disproportionately affected by harmful content.

We are grateful for the partnerships we have with expert civil society organizations that help us to better understand local context, trends in speech, and signals of imminent harm.