Bringing local context to our global standards

Updated: January 18, 2023

Civil society organizations are critical partners in helping Meta better understand the impact of our platforms and the context of the diverse communities in which we operate around the globe.

In recognition of this expertise, we regularly engage with civil society organizations around the world (which may be global, regional, or local in scope) as part of our policy development process. In addition, we partner with a subset of civil society organizations through our Trusted Partner program to foster deeper collaboration with specific local organizations and to strengthen their social media monitoring capacity. The Trusted Partner program is a key part of our efforts to improve our policies, enforcement processes, and products to help keep users safe on our platforms.

Our network of Trusted Partners includes over 400 non-governmental organizations, humanitarian agencies, human rights defenders, and researchers from 113 countries around the globe. From local organizations such as Tech4Peace in Iraq and Defy Hate Now in South Sudan, to international organizations like Internews, our partners bring a wealth of knowledge and experience to help inform our content moderation efforts. We partner with expert organizations that represent the voices and experiences of marginalized users around the globe and are equipped to raise questions and concerns about content on Facebook and Instagram in order to:

  • Address problematic content trends and prevent harm

  • Foster online safety and security

  • Inform the development of effective and transparent policies

In addition to reporting content, Trusted Partners provide crucial feedback on our content policies and enforcement to help ensure that our efforts keep users safe. Our Trusted Partners’ subject matter and regional expertise strengthen our policies by enabling us to weigh a range of perspectives in our content moderation efforts. For example, to better understand harmful misinformation, we consult with Trusted Partners to gain local insight into the conditions and contexts in which certain forms of misinformation may contribute to a risk of imminent violence or physical harm, escalate social tensions, trigger violence, or undermine democratic processes. These consultations enable us to develop tailored policies that help keep our community safe during times of crisis.

In selecting our Trusted Partners, we seek organizations that have experience in social media monitoring, an interest in learning about our content policies, and a commitment to keeping online communities safe.

We are grateful for the partnerships we have with expert civil society organizations that help us to better understand local context, trends in speech, and signals of imminent harm.