Oversight Board recommendations

UPDATED: OCT 14, 2021

In addition to issuing binding decisions on individual pieces of content, the Oversight Board can issue recommendations on Facebook’s content policies and on how we enforce those policies on the Facebook app and Instagram.

Facebook is committed to considering these recommendations as important inputs to our internal policy processes and to responding publicly to each one within 30 days. Unlike the board’s decisions on individual cases, recommendations are not binding on Facebook.

Board recommendations

2021-009-FB-UA-1 | 14 Oct 2021 | Action: Assessing feasibility | Status: Under investigation
Add criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting.

2021-009-FB-UA-2 | 14 Oct 2021 | Action: Assessing feasibility | Status: Under investigation
Ensure swift translation of updates to the Community Standards into all available languages.

2021-009-FB-UA-3 | 14 Oct 2021 | Action: Implementing fully | Status: Under investigation
Engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, has been applied without bias. The report and its conclusions should be made public.

2021-009-FB-UA-4 | 14 Oct 2021 | Action: Implementing in part | Status: Under investigation
Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting. The transparency reporting should distinguish government requests that led to removals for violations of the Community Standards from requests that led to removal or geo-blocking for violating local law, in addition to requests that led to no action.

2021-008-FB-FBR-1 | 17 Sept 2021 | Action: Implementing fully | Status: Under investigation
Facebook should provide more transparency within the False News Community Standard regarding when content is eligible for fact-checking, including whether public institutions’ accounts are subject to fact-checking.

2021-008-FB-FBR-2 | 17 Sept 2021 | Action: Work Facebook already does | Status: No further updates
Given the context of the COVID-19 pandemic, Facebook should make technical arrangements to prioritize fact-checking of potential health misinformation shared by public authorities that comes to the company’s attention, taking into consideration the local context.

2021-008-FB-FBR-3 | 17 Sept 2021 | Action: Work Facebook already does | Status: No further updates
Facebook should conduct a proportionality analysis to identify a range of less intrusive measures than removing the content, including labeling content, introducing friction to posts to prevent interactions or sharing, and downranking. All these enforcement measures should be clearly communicated to all users, and subject to appeal.

2021-007-FB-UA-1 | 10 Sept 2021 | Action: No further updates | Status: Under investigation
Facebook should ensure that its Internal Implementation Standards are available in the language in which content moderators review content. If necessary to prioritize, Facebook should focus first on contexts where the risks to human rights are more severe.

2021-006-FB-UA-1 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Immediately restore the misplaced 2017 guidance to the Internal Implementation Standards and Known Questions (the internal guidance for content moderators), informing all content moderators that it exists and arranging immediate training on it.

2021-006-FB-UA-2 | 6 Aug 2021 | Action: No further action | Status: Under investigation
Evaluate automated moderation processes for enforcement of the Dangerous Individuals and Organizations policy and, where necessary, update classifiers to exclude training data from prior enforcement errors that resulted from failures to apply the 2017 guidance.

2021-006-FB-UA-3 | 6 Aug 2021 | Action: Implementing in part | Status: Under investigation
Publish the results of the ongoing review process to determine if any other policies were lost, including descriptions of all lost policies, the period the policies were lost for, and steps taken to restore them.

2021-006-FB-UA-4 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Reflect in the Dangerous Individuals and Organizations “policy rationale” that respect for human rights and freedom of expression can advance the value of “Safety,” and that it is important for the platform to provide a space for these discussions.

2021-006-FB-UA-5 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Add to the Dangerous Individuals and Organizations policy a clear explanation of what “support” excludes. Users should be free to discuss alleged violations and abuses of the human rights of members of designated organizations. Calls for accountability for human rights violations and abuses should also be protected.

2021-006-FB-UA-6 | 6 Aug 2021 | Action: Implementing in part | Status: Under investigation
Explain in the Community Standards how users can make the intent behind their posts clear to Facebook. This would be assisted by implementing the Board’s existing recommendation to publicly disclose the company’s list of designated individuals and organizations (see case 2020-005-FB-UA). Facebook should also provide illustrative examples to demonstrate the line between permitted and prohibited content, including in relation to the application of the rule clarifying what “support” excludes.

2021-006-FB-UA-7 | 6 Aug 2021 | Action: Work Facebook already does | Status: Under investigation
Ensure meaningful stakeholder engagement on the proposed policy change through Facebook’s Product Policy Forum, including through a public call for inputs. Facebook should conduct this engagement in multiple languages across regions, ensuring the effective participation of individuals most impacted by the harms this policy seeks to prevent.

2021-006-FB-UA-8 | 6 Aug 2021 | Action: Work Facebook already does | Status: Under investigation
Ensure internal guidance and training are provided to content moderators on any new policy. Content moderators should be given adequate resources to understand the new policy, and adequate time to make decisions when enforcing it.

2021-006-FB-UA-9 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Ensure that users are notified when their content is removed. The notification should state whether the removal is due to a government request, a violation of the Community Standards, or a government claiming that a national law has been violated (and the jurisdictional reach of any removal).

2021-006-FB-UA-10 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Clarify to Instagram users that Facebook’s Community Standards apply to Instagram in the same way they apply to Facebook, in line with the recommendation in case 2020-004-IG-UA.

2021-006-FB-UA-11 | 6 Aug 2021 | Action: Implementing fully | Status: Under investigation
Include information on the number of requests Facebook receives for content removals from governments that are based on Community Standards violations (as opposed to violations of national law), and the outcome of those requests.

2021-006-FB-UA-12 | 6 Aug 2021 | Action: Assessing feasibility | Status: Under investigation
Include more comprehensive information on error rates for enforcing rules on “praise” and “support” of dangerous individuals and organizations, broken down by region and language.

2021-004-FB-UA-1 | 25 Jun 2021 | Action: Implementing in part | Status: Under investigation
Explain the relationship between the policy rationale and the “Do nots,” as well as the other rules restricting content that follow it.

2021-004-FB-UA-2 | 25 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Differentiate between bullying and harassment and provide definitions that distinguish the two acts. Further, the Community Standard should clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

2021-004-FB-UA-3 | 25 Jun 2021 | Action: Implementing fully | Status: Under investigation
Clearly define its approach to different target user categories and provide illustrative examples of each target category (i.e. who qualifies as a public figure). Format the Community Standard on Bullying and Harassment by the user categories currently listed in the policy.

2021-004-FB-UA-4 | 25 Jun 2021 | Action: Implementing in part | Status: Under investigation
Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

2021-004-FB-UA-5 | 25 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
When assessing content including a “negative character claim” against a private adult, Facebook should amend the Community Standard to require an assessment of the social and political context of the content. Facebook should reconsider the enforcement of this rule in political or public debates where the removal of the content would stifle debate.

2021-004-FB-UA-6 | 25 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

2021-005-FB-UA-1 | 17 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should make technical arrangements to ensure that notice to users refers to the Community Standard enforced by the company.

2021-005-FB-UA-2 | 17 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should include the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard.

2021-005-FB-UA-3 | 17 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should make sure that it has adequate procedures in place to assess satirical content and relevant context properly, including by providing content moderators with additional resources.

2021-005-FB-UA-4 | 17 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy.

2021-005-FB-UA-5 | 17 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should ensure appeals based on policy exceptions are prioritized for human review.

2021-001-FB-FBR-1 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should act quickly on posts made by influential users that pose a high probability of imminent harm.

2021-001-FB-FBR-2 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should consider the context of posts by influential users when assessing a post’s risk of harm.

2021-001-FB-FBR-3 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should prioritize safety over expression when taking action on a threat of harm from influential users.

2021-001-FB-FBR-4 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should suspend the accounts of high government officials, such as heads of state, if their posts repeatedly pose a risk of harm.

2021-001-FB-FBR-5 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should suspend accounts of high government officials, such as heads of state, for a determinate period sufficient to protect against imminent harm. Periods of suspension should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.

2021-001-FB-FBR-6 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should resist pressure from governments to silence their political opposition and consider the relevant political context, including off Facebook and Instagram, when evaluating political speech from highly influential users.

2021-001-FB-FBR-7 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should have a process that utilizes regional political and linguistic expertise, with adequate resourcing, when evaluating political speech from highly influential users.

2021-001-FB-FBR-8 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users.

2021-001-FB-FBR-9 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should assess the on- and offline risk of harm before lifting an influential user’s account suspension.

2021-001-FB-FBR-10 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should document any exceptional processes that apply to influential users.

2021-001-FB-FBR-11 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should more clearly explain its newsworthiness allowance.

2021-001-FB-FBR-12 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
In regard to cross-check review for influential users, Facebook should clearly explain the rationale, standards, and processes of review, including the criteria to determine which pages and accounts are selected for inclusion.

2021-001-FB-FBR-13 | 4 Jun 2021 | Action: No further action | Status: Under investigation
Facebook should report on the relative error rates and thematic consistency of determinations made through the cross-check process compared with ordinary enforcement procedures.

2021-001-FB-FBR-14 | 4 Jun 2021 | Action: Implementing in part | Status: Under investigation
Facebook should review its potential role in the election fraud narrative that sparked violence in the United States on January 6, 2021, and report on its findings.

2021-001-FB-FBR-15 | 4 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should be clear in its Corporate Human Rights policy how it collects, preserves and shares information related to investigations and potential prosecutions, including how researchers can access that information.

2021-001-FB-FBR-16 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should explain in its Community Standards and Guidelines its strikes and penalties process for restricting profiles, pages, groups and accounts on Facebook and Instagram in a clear, comprehensive, and accessible manner.

2021-001-FB-FBR-17 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should tell users how many violations, strikes, and penalties they have, as well as the consequences of future violations.

2021-001-FB-FBR-18 | 4 Jun 2021 | Action: Assessing feasibility | Status: Under investigation
In its transparency reporting, Facebook should include numbers of profile, page, and account restrictions, including the reason and manner in which enforcement action was taken, with information broken down by region and country.

2021-001-FB-FBR-19 | 4 Jun 2021 | Action: Implementing fully | Status: Under investigation
Facebook should develop and publish a policy that governs its response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.

2021-003-FB-UA-1 | 27 May 2021 | Action: Committed to action | Status: Under investigation
Translate its Community Standards and Internal Implementation Standards into Punjabi. Facebook should aim to make its Community Standards accessible in all languages widely spoken by its users.

2021-003-FB-UA-2 | 27 May 2021 | Action: Committed to action | Status: Under investigation
The company should restore human review and access to a human appeals process to pre-pandemic levels as soon as possible while fully protecting the health of Facebook’s staff and contractors.

2021-003-FB-UA-3 | 27 May 2021 | Action: Assessing feasibility | Status: Under investigation
Facebook should improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard.

2020-007-FB-FBR-1 | 11 Mar 2021 | Action: Implemented in part | Status: No further updates
Provide people with additional information regarding the scope and enforcement of restrictions on veiled threats. This would help people understand what content is allowed in this area. Facebook should make their enforcement criteria public. These should consider the intent and identity of the person, as well as their audience and the wider context.

2020-006-FB-FBR-7 | 25 Feb 2021 | Action: No further action | Status: Closed
In cases where people post information about COVID-19 treatments that contradicts the specific advice of health authorities and where a potential for physical harm is identified but is not imminent, Facebook should adopt a range of less intrusive measures.

2020-006-FB-FBR-6 | 25 Feb 2021 | Action: Implemented in part | Status: No further updates
Conduct a human rights impact assessment with relevant stakeholders as part of its process of rule modification.

2020-006-FB-FBR-5 | 25 Feb 2021 | Action: Implementing in part | Status: In progress
Publish a transparency report on how the Community Standards have been enforced during the COVID-19 global health crisis.

2020-006-FB-FBR-4 | 25 Feb 2021 | Action: Committed to action | Status: No further updates
To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should conduct an assessment of its existing range of tools to deal with health misinformation and consider the potential for development of further tools that are less intrusive than content removals.

2020-006-FB-FBR-3 | 25 Feb 2021 | Action: Implementing in part | Status: No further updates
To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should clarify the particular harms it is seeking to prevent and provide transparency about how it will assess the potential harm of particular content.

2020-006-FB-FBR-2 | 25 Feb 2021 | Action: Implemented in part | Status: No further updates
Facebook should 1) publish its range of enforcement options within the Community Standards, ranking these options from most to least intrusive based on how they infringe freedom of expression, 2) explain what factors, including evidence-based criteria, the platform will use in selecting the least intrusive option when enforcing its Community Standards to protect public health, and 3) make clear within the Community Standards what enforcement option applies to each policy.

2020-006-FB-FBR-1 | 25 Feb 2021 | Action: Implemented in part | Status: No further updates
Clarify the Community Standards with respect to health misinformation, particularly with regard to COVID-19. Facebook should set out a clear and accessible policy on health misinformation, consolidating and clarifying existing policies in one place.

2020-005-FB-UA-3 | 25 Feb 2021 | Action: Assessing feasibility | Status: In progress
Provide a public list of the organizations and individuals designated “dangerous” under the policy on dangerous individuals and organizations.

2020-005-FB-UA-2 | 25 Feb 2021 | Action: Implemented in part | Status: No further updates
Explain and provide examples of the application of key terms used in the Dangerous Individuals and Organizations policy. These should align with the definitions used in Facebook’s Internal Implementation Standards.

2020-005-FB-UA-1 | 25 Feb 2021 | Action: Implemented in part | Status: No further updates
Ensure that users are always notified of the Community Standards Facebook is enforcing.

2020-003-FB-UA-1 | 25 Feb 2021 | Action: Implementing in part | Status: In progress
Go beyond naming the policy that Facebook is enforcing, and add more specifics about which part of the Community Standards the user violated.

2020-004-IG-UA-6 | 25 Feb 2021 | Action: Assessing feasibility | Status: In progress
Expand transparency reporting to disclose data on the number of automated removal decisions, and the proportion of those decisions subsequently reversed following human review.

2020-004-IG-UA-5 | 25 Feb 2021 | Action: Assessing feasibility | Status: In progress
Inform people when automation is used to take enforcement action against their content, including accessible descriptions of what this means.

2020-004-IG-UA-4 | 25 Feb 2021 | Action: Implementing fully | Status: No further updates
Ensure people can appeal decisions taken by automated systems to human review when their content is found to have violated the policy on adult nudity and sexual activity.

2020-004-IG-UA-3 | 25 Feb 2021 | Action: Implementing in part | Status: No further updates
When communicating to people about how they violated policies, be clear about the relationship between the Community Guidelines and Community Standards.

2020-004-IG-UA-2 | 25 Feb 2021 | Action: Implementing fully | Status: In progress
Revise the Instagram Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between the Community Guidelines and the Community Standards, the latter take precedence.

2020-004-IG-UA-1 | 25 Feb 2021 | Action: Implementing fully | Status: In progress
Improve automated detection of images with text overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.