The principles that guide Meta’s stakeholder engagement

UPDATED: JAN 26, 2022

Our commitment to stakeholder engagement means addressing a number of essential questions, such as: How does Meta decide who to engage with? How do we find relevant experts? How do we make sure that vulnerable groups are heard?

There’s no simple formula for responding to these questions. But we’ve developed a structure and methodology for engaging stakeholders, centered on three core principles: inclusiveness, expertise and transparency.

Inclusiveness

Stakeholder engagement broadens our perspective and creates a more inclusive approach to policymaking.

Stakeholder engagement helps us better understand how our policies impact people and organizations. When we make decisions about what content to remove and what to leave up, we affect how people communicate with each other on Facebook. Not everyone will agree on where we draw the lines. But at a minimum, we need to understand the concerns of those who are impacted by our policies, whether they agree or disagree with them.

It’s particularly important that we hear the voices of stakeholders from marginalized communities. That’s why we reach out to a broad spectrum of stakeholders across the world. It's not enough to ask how our policies affect “people in general.” We need to understand how our policies will impact people who are particularly vulnerable by virtue of laws, cultural practices, poverty or other reasons that prevent them from speaking up for their rights.

The question of our impact plays out in many ways. While our policies are global, they affect people on a very personal level. Our policy development needs to reflect cultural sensitivity and a deep understanding of local context.

Stakeholder engagement gives us a tool to deepen our local knowledge and perspective, so we can hear voices across the policy spectrum that we might otherwise miss.

Of course, it’s not always self-evident what “the spectrum” is. In many cases, our policies don’t line up neatly with traditional political dichotomies, such as liberal versus conservative or civil libertarianism versus state control. We talk to others in Meta’s Policy and Research organizations and conduct our own research to identify a diverse range of stakeholders.

For example, in considering how our hate speech policy should apply to certain forms of gendered language, we spoke with academic experts, women’s and digital rights groups, and free speech advocates. Likewise, when considering our policy on adult nudity and sexual activity in art, we listened to family safety organizations, as well as artists and museum curators. In reviewing how our policies should apply to memorialized profiles of deceased people, we connected with both professors who study digital legacy as an academic subject and people on Facebook who’ve been designated as “Legacy Contacts” and have real-world experience with this product feature.

In our stakeholder mapping, we also seek input from minority groups around the world that have traditionally lacked power, such as political dissidents and religious minorities. For example, in reevaluating how our hate speech policy applies to certain behavioral generalizations, we consulted with immigrants’ rights groups.

Expertise

Stakeholder engagement brings expertise to our policy development process.

The Stakeholder Engagement team conducts research to gather input from top subject matter experts for a given policy. This ensures our policy-making process is informed by current theories and analysis, empirical research and an understanding of the latest online trends. The expertise we gather covers issues of language, social identity and geography, all of which bear on our policies in important ways.

Our policies are entwined with many complex social and technological issues, such as hate speech, terrorism, bullying and harassment, and threats of violence. Sometimes we’re looking for guidance on how safety and voice should be balanced, such as when considering what types of speech to allow about “public figures” under our policies. In other cases, we’re reaching out to gain specialized knowledge, such as how our policies can draw on international human rights principles or how minority communities may experience certain types of speech.

Sometimes the challenges we face are new, even to the experts we consult with. But by talking with outside experts and incorporating their feedback, we make our policies more thoughtful.

For example, our hate speech policy recognizes three tiers of attacks. Tier 1, the most severe, involves calls to violence or dehumanizing speech against other people based on their race, ethnicity, nationality, gender or other protected characteristic (for example, “Kill the Christians”). Tier 2 attacks consist of statements of inferiority or expressions of contempt or disgust (for example, “Mexicans are lazy”). And Tier 3 covers calls to exclude or segregate (for example, “No women allowed”).

These tiers make our policies more nuanced and precise. On the basis of the tiers, we're able to provide additional protections against the most harmful forms of speech. For instance, we remove Tier 1 hate speech directed against immigrants (for example, “Immigrants are rats”) but permit less intense forms of speech (for example, “Immigrants should stay out of our country”) to leave room for broad political discourse.

As part of our policy development work in this area, we spoke with outside experts: academics, NGOs that study hate speech, and groups from across the political landscape. This stakeholder engagement helped confirm that the tiers were comprehensive and aligned with patterns of online and offline behavior.

Transparency

Stakeholder engagement makes our policies and our policy development process more transparent.

We know from talking to hundreds of stakeholders that we can build trust by making sure our policy-development process is open. The more visibility we provide, the more likely our stakeholders are to view our policies as legitimate. Transparency in our stakeholder engagement process helps us build a system of rules and enforcement that people regard as fair.

Engagement also means being open about the challenges of moderating content, as well as explaining the rationale behind our policies and why there may be a need for improvement. In turn, the policies we launch will be better by virtue of having been tested through consultation and a candid exchange of views.