Bringing local context to our global standards

UPDATED JAN 18, 2023

Civil society organisations are critical partners in helping Meta better understand the impact of our platforms and the context of the diverse communities in which we operate around the globe. We've built the trusted partner programme to foster this partnership with civil society, strengthen the social media monitoring capacity of local organisations and improve our policies, enforcement processes and products to help keep users safe on our platforms.

Our network of trusted partners includes over 400 non-governmental organisations, humanitarian agencies, human rights defenders and researchers from 113 countries around the globe. From local organisations such as Tech4Peace in Iraq and Defy Hate Now in South Sudan to international organisations such as Internews, our partners bring a wealth of knowledge and experience that helps inform our content moderation efforts. Meta provides trusted partners with funding to support our shared goals of keeping harmful content off our platforms and preventing offline harm.

We partner with expert organisations that represent the voices and experiences of at-risk users around the globe and are equipped to raise questions and concerns about content on Facebook and Instagram to:

  • Address problematic content trends and prevent harm

  • Foster online safety and security

  • Inform the development of effective and transparent policies

In addition to reporting content, trusted partners provide crucial feedback on our Content Policies and enforcement to help ensure that our efforts keep users safe. Our trusted partners' subject matter and regional expertise strengthen our policies by enabling us to consider a range of perspectives in our content moderation efforts. For example, we consult with trusted partners to gain local insight into the conditions and contexts in which certain forms of misinformation may contribute to the risk of imminent violence or physical harm, escalate social tensions, trigger violence or undermine democratic processes. These consultations enable us to develop tailored policies that help keep our community safe during times of crisis.

In selecting our trusted partners, we seek organisations that have experience in social media monitoring and an interest in learning about our Content Policies, demonstrate a commitment to keeping online communities safe, and represent marginalised groups that are disproportionately affected by harmful content.

We are grateful for the partnerships that we have with expert civil society organisations that help us to better understand local context, trends in speech and signals of imminent harm.