How Facebook reviews content enforcement issues

UPDATED JUN 11, 2021

Facebook reviews content enforcement issues weekly to consider whether a different outcome would have been better. These reviews use the following process:

Content enforcement issue

Is there a problem with the functionality of the product?

YES
EXAMPLE: We don’t have a reporting option on Messenger.
ACTION: Work with the Product team to ensure we have a reporting option across all messaging surfaces.

NO

Is there a problem with the way the policy is written?

YES
EXAMPLE: A type of problematic content is not reflected in the Facebook Community Standards.
ACTION: Work with external experts to rewrite or update the policy.

NO

Is there a problem with the way we enforced the policy?

YES
EXAMPLE: We have a general policy against slurs, but the exact list of slurs is missing something.
ACTION: Update our internal documentation and train reviewers to ensure we can effectively enforce.

NO

Is there a problem with the way we reviewed the content?

YES
EXAMPLE: Our technology isn’t doing enough to understand the sentiment of image and text together.
ACTION: Invest in more machine learning to improve our classifiers.

NO

People disagree with the outcome
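The review process above can be sketched as a simple cascade of checks: the first question answered YES determines the follow-up action, and if every answer is NO, the outcome stands. This is a hypothetical illustration only; the check and action names below are assumptions for readability, not Facebook's internal tooling.

```python
# A minimal sketch of the review cascade described above.
# Each entry pairs one question from the flowchart with its action;
# the first check that applies wins. All names are hypothetical.

REVIEW_CHECKS = [
    ("product functionality problem",
     "Work with the Product team to add the missing feature."),
    ("policy wording problem",
     "Work with external experts to rewrite or update the policy."),
    ("policy enforcement problem",
     "Update internal documentation and train reviewers."),
    ("content review problem",
     "Invest in more machine learning to improve classifiers."),
]

def review_issue(problems):
    """Return the action for the first check that applies.

    `problems` is the set of problem types found for the issue.
    If none apply, the outcome stands: people simply disagree.
    """
    for problem, action in REVIEW_CHECKS:
        if problem in problems:
            return action
    return "No change: people disagree with the outcome."
```

For example, an issue flagged only as a policy wording problem falls through the first check and triggers the policy-rewrite action, while an issue with no identified problems returns the "no change" outcome.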