At Meta, we’re committed to giving people a voice and keeping them safe.

Since 2016, we've used a strategy called "remove, reduce, inform" to manage content across Meta technologies.

This means we remove harmful content that goes against our policies, reduce the distribution of problematic content that doesn’t violate our policies, and inform people with additional context so they can decide what to click, read or share.

To help with this strategy, we have policies that describe what is and isn’t allowed on our technologies. Our teams work together to develop our policies and enforce them. Here’s how it works.

1. We collaborate with global experts in technology, public safety and human rights to create and update our policies.

2. We build safety features so people can report content and block, hide or unfollow accounts.

3. We enforce our policies using a combination of technology and human review.

4. We keep people safe and let people hold us accountable by publishing our policies, enforcement data and transparency reports.
