Oversight Board recommendations

UPDATED: NOV 16, 2021

In addition to issuing binding decisions on content, the Oversight Board can issue recommendations about Meta’s content policies and how we enforce them on the Facebook app and Instagram.

Meta is committed to both considering these recommendations as important inputs to our internal policy processes and publicly responding to each recommendation within 30 days. Unlike the board’s decisions on individual cases, however, recommendations are not binding on Meta.

Below is a table of the recommendations Meta has received from the Oversight Board so far. For each recommendation, we list its number (which identifies the related case), the recommendation itself, the date of our 30-day response, our commitment level, and the implementation status. All additional recommendation updates and details can be found in Meta’s Quarterly Update on the Oversight Board.

We categorize our response to the board’s recommendations in the following areas:

  • Implementing fully: We agree with the recommendation and have implemented it in full or will do so.

  • Implementing in part: We agree with the overall aim of the recommendation and have implemented, or will implement, work related to the board’s guidance.

  • Assessing feasibility: We are assessing the feasibility and impact of the recommendation and will provide further updates in the future.

  • Work Meta already does: We have already addressed the board’s recommendation through existing work, so no further update is required.

  • No further action: We will not implement the recommendation, either due to a lack of feasibility or disagreement about how to reach the desired outcome.

The current status of our response to each of the board’s recommendations is one of the following:

  • Complete: We have completed full or partial implementation in line with our response to the board’s recommendation, and will have no further updates on the recommendation in the future.

  • In progress: We are continuing to make progress on our response to the board’s recommendation, and will have further updates on the recommendation in the future.

  • No further updates: We will not implement the recommendation, or we have already addressed it through existing work, and will have no further updates on the recommendation in the future.

Board recommendations

Each entry below lists, in order: Rec Number, Recommendation, Date of 30-Day Response, Action, and Status.

2021-011-FB-UA-1

Notify users of the specific rule within the Hate Speech Community Standard that has been violated in the language in which they use Facebook, as recommended in case decision 2020-003-FB-UA (Armenians in Azerbaijan) and case decision 2021-002-FB-UA (Depiction of Zwarte Piet). The Board looks forward to Facebook providing information that confirms implementation for English-language users and information about the timeframe for implementation for other language users.

27 Oct 2021

Implementing in part

In progress

2021-010-FB-UA-1

Publish illustrative examples from the list of slurs it has designated as violating under its Hate Speech Community Standard. These examples should be included in the Community Standard and include edge cases involving words which may be harmful in some contexts but not others, describing when their use would be violating. Facebook should clarify to users that these examples do not constitute a complete list.

27 Oct 2021

Assessing feasibility

In progress

2021-010-FB-UA-2

Link the short explanation of the newsworthiness allowance provided in the introduction to the Community Standards to the more detailed Transparency Center explanation of how this policy applies. The company should supplement this explanation with illustrative examples from a variety of contexts, including reporting on large scale protests.

27 Oct 2021

Implementing fully

In progress

2021-010-FB-UA-3

Develop and publicize clear criteria for content reviewers to escalate, for additional review, public interest content that potentially violates the Community Standards but may be eligible for the newsworthiness allowance.

27 Oct 2021

Work Meta already does

No further updates

2021-010-FB-UA-4

Notify all users who reported content assessed as violating but left on the platform for public interest reasons that the newsworthiness allowance was applied to the post. The notice should link to the Transparency Center explanation of the newsworthiness allowance.

27 Oct 2021

Assessing feasibility

In progress

2021-009-FB-UA-1

Add criteria and illustrative examples to its Dangerous Individuals and Organizations policy to increase understanding of the exceptions for neutral discussion, condemnation and news reporting.

14 Oct 2021

Assessing feasibility

In progress

2021-009-FB-UA-2

Ensure swift translation of updates to the Community Standards into all available languages.

14 Oct 2021

Assessing feasibility

In progress

2021-009-FB-UA-3

Engage an independent entity not associated with either side of the Israeli-Palestinian conflict to conduct a thorough examination to determine whether Facebook’s content moderation in Arabic and Hebrew, including its use of automation, has been applied without bias. The report and its conclusions should be made public.

14 Oct 2021

Implementing fully

In progress

2021-009-FB-UA-4

Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting. The transparency reporting should distinguish government requests that led to removals for violations of the Community Standards from requests that led to removal or geo-blocking for violating local law, in addition to requests that led to no action.

14 Oct 2021

Implementing in part

In progress

2021-008-FB-FBR-1

Facebook should conduct a proportionality analysis to identify a range of less intrusive measures than removing the content, including labeling content, introducing friction to posts to prevent interactions or sharing, and downranking. All these enforcement measures should be clearly communicated to all users, and subject to appeal.

17 Sep 2021

Work Meta already does

No further updates

2021-008-FB-FBR-2

Given the context of the COVID-19 pandemic, Facebook should make technical arrangements to prioritize fact-checking of potential health misinformation shared by public authorities which comes to the company’s attention, taking into consideration the local context.

17 Sep 2021

Work Meta already does

No further updates

2021-008-FB-FBR-3

Facebook should provide more transparency within the False News Community Standard regarding when content is eligible for fact-checking, including whether public institutions' accounts are subject to fact-checking.

17 Sep 2021

Implementing fully

Complete

2021-007-FB-UA-1

Facebook should ensure that its Internal Implementation Standards are available in the language in which content moderators review content. If prioritization is necessary, Facebook should focus first on contexts where the risks to human rights are more severe.

10 Sep 2021

Work Meta already does

No further updates

2021-006-FB-UA-1

Immediately restore the misplaced 2017 guidance to the Internal Implementation Standards and Known Questions (the internal guidance for content moderators), informing all content moderators that it exists and arranging immediate training on it.

6 Aug 2021

Implementing fully

Complete

2021-006-FB-UA-2

Evaluate automated moderation processes for enforcement of the Dangerous Individuals and Organizations policy and where necessary update classifiers to exclude training data from prior enforcement errors that resulted from failures to apply the 2017 guidance.

6 Aug 2021

No further action

No further updates

2021-006-FB-UA-3

Publish the results of the ongoing review process to determine if any other policies were lost, including descriptions of all lost policies, the periods during which they were lost, and steps taken to restore them.

6 Aug 2021

Implementing in part

In progress

2021-006-FB-UA-4

Reflect in the Dangerous Individuals and Organizations “policy rationale” that respect for human rights and freedom of expression can advance the value of “Safety,” and that it is important for the platform to provide a space for these discussions.

6 Aug 2021

Implementing fully

In progress

2021-006-FB-UA-5

Add to the Dangerous Individuals and Organizations policy a clear explanation of what “support” excludes. Users should be free to discuss alleged violations and abuses of the human rights of members of designated organizations. Calls for accountability for human rights violations and abuses should also be protected.

6 Aug 2021

Implementing fully

In progress

2021-006-FB-UA-6

Explain in the Community Standards how users can make the intent behind their posts clear to Facebook. This would be assisted by implementing the Board’s existing recommendation to publicly disclose the company’s list of designated individuals and organizations (see: case 2020-005-FB-UA). Facebook should also provide illustrative examples to demonstrate the line between permitted and prohibited content, including in relation to the application of the rule clarifying what “support” excludes.

6 Aug 2021

Implementing in part

In progress

2021-006-FB-UA-7

Ensure meaningful stakeholder engagement on the proposed policy change through Facebook’s Product Policy Forum, including through a public call for inputs. Facebook should conduct this engagement in multiple languages across regions, ensuring the effective participation of individuals most impacted by the harms this policy seeks to prevent.

6 Aug 2021

Work Meta already does

No further updates

2021-006-FB-UA-8

Ensure internal guidance and training are provided to content moderators on any new policy. Content moderators should be given adequate resources to understand the new policy, and adequate time to make decisions when enforcing it.

6 Aug 2021

Work Meta already does

No further updates

2021-006-FB-UA-9

Ensure that users are notified when their content is removed. The notification should note whether the removal is due to a government request, a violation of the Community Standards, or a government claim that a national law has been violated (and the jurisdictional reach of any removal).

6 Aug 2021

Assessing feasibility

In progress

2021-006-FB-UA-10

Clarify to Instagram users that Facebook’s Community Standards apply to Instagram in the same way they apply to Facebook, in line with the recommendation in case 2020-004-IG-UA.

6 Aug 2021

Implementing fully

In progress

2021-006-FB-UA-11

Include information on the number of requests Facebook receives for content removals from governments that are based on Community Standards violations (as opposed to violations of national law), and the outcome of those requests.

6 Aug 2021

Implementing fully

In progress

2021-006-FB-UA-12

Include more comprehensive information on error rates for enforcing rules on “praise” and “support” of dangerous individuals and organizations, broken down by region and language.

6 Aug 2021

Assessing feasibility

In progress

2021-004-FB-UA-1

Explain the relationship between the policy rationale and the “Do nots” as well as the other rules restricting content that follow it.

25 Jun 2021

Implementing in part

In progress

2021-004-FB-UA-2

Differentiate between bullying and harassment and provide definitions that distinguish the two acts. Further, the Community Standard should clearly explain to users how bullying and harassment differ from speech that only causes offense and may be protected under international human rights law.

25 Jun 2021

Assessing feasibility

In progress

2021-004-FB-UA-3

Clearly define its approach to different target user categories and provide illustrative examples of each target category (e.g., who qualifies as a public figure). Format the Community Standard on Bullying and Harassment by the user categories currently listed in the policy.

25 Jun 2021

Implementing fully

In progress

2021-004-FB-UA-4

Include illustrative examples of violating and non-violating content in the Bullying and Harassment Community Standard to clarify the policy lines drawn and how these distinctions can rest on the identity status of the target.

25 Jun 2021

Implementing in part

In progress

2021-004-FB-UA-5

Amend the Community Standard to require an assessment of the social and political context when reviewing content that includes a “negative character claim” against a private adult. Facebook should also reconsider the enforcement of this rule in political or public debates where removal of the content would stifle debate.

25 Jun 2021

Assessing feasibility

In progress

2021-004-FB-UA-6

Whenever Facebook removes content because of a negative character claim that is only a single word or phrase in a larger post, it should promptly notify the user of that fact, so that the user can repost the material without the negative character claim.

25 Jun 2021

Assessing feasibility

In progress

2021-005-FB-UA-1

Facebook should make technical arrangements to ensure that notice to users refers to the Community Standard enforced by the company.

17 Jun 2021

Assessing feasibility

In progress

2021-005-FB-UA-2

Facebook should include the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard.

17 Jun 2021

Implementing fully

In progress

2021-005-FB-UA-3

Facebook should make sure that it has adequate procedures in place to properly assess satirical content and relevant context, including by providing content moderators with additional resources.

17 Jun 2021

Implementing in part

In progress

2021-005-FB-UA-4

Facebook should let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy.

17 Jun 2021

Assessing feasibility

In progress

2021-005-FB-UA-5

Facebook should ensure appeals based on policy exceptions are prioritized for human review.

17 Jun 2021

Assessing feasibility

In progress

2021-001-FB-FBR-1

Facebook should act quickly on posts made by influential users that pose a high probability of imminent harm.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-2

Facebook should consider the context of posts by influential users when assessing a post’s risk of harm.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-3

Facebook should prioritize safety over expression when taking action on a threat of harm from influential users.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-4

Facebook should suspend the accounts of high government officials, such as heads of state, if their posts repeatedly pose a risk of harm.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-5

Facebook should suspend accounts of high government officials, such as heads of state, for a determinate period sufficient to protect against imminent harm. Periods of suspension should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-6

Facebook should resist pressure from governments to silence their political opposition, and consider the relevant political context, including off Facebook and Instagram, when evaluating political speech from highly influential users.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-7

Facebook should have a process that utilizes regional political and linguistic expertise, with adequate resourcing, when evaluating political speech from highly influential users.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-8

Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-9

Facebook should assess the on- and offline risk of harm before lifting an influential user’s account suspension.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-10

Facebook should document any exceptional processes that apply to influential users.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-11

Facebook should more clearly explain its newsworthiness allowance.

4 Jun 2021

Implementing fully

In progress

2021-001-FB-FBR-12

In regard to cross check review for influential users, Facebook should clearly explain the rationale, standards, and processes of review, including the criteria to determine which pages and accounts are selected for inclusion.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-13

Facebook should report on the relative error rates and thematic consistency of determinations made through the cross check process compared with ordinary enforcement procedures.

4 Jun 2021

No further action

No further updates

2021-001-FB-FBR-14

Facebook should review its potential role in the election fraud narrative that sparked violence in the United States on January 6, 2021, and report on its findings.

4 Jun 2021

Work Meta already does

No further updates

2021-001-FB-FBR-15

Facebook should be clear in its Corporate Human Rights policy about how it collects, preserves and shares information related to investigations and potential prosecutions, including how researchers can access that information.

4 Jun 2021

Assessing feasibility

In progress

2021-001-FB-FBR-16

Facebook should explain in its Community Standards and Guidelines its strikes and penalties process for restricting profiles, pages, groups and accounts on Facebook and Instagram in a clear, comprehensive, and accessible manner.

4 Jun 2021

Implementing fully

In progress

2021-001-FB-FBR-17

Facebook should tell users how many violations, strikes, and penalties they have, as well as the consequences of future violations.

4 Jun 2021

Implementing fully

Complete

2021-001-FB-FBR-18

In its transparency reporting, Facebook should include numbers of profile, page, and account restrictions, including the reason and manner in which enforcement action was taken, with information broken down by region and country.

4 Jun 2021

Assessing feasibility

In progress

2021-001-FB-FBR-19

Facebook should develop and publish a policy that governs its response to crises or novel situations where its regular processes would not prevent or avoid imminent harm.

4 Jun 2021

Implementing fully

In progress

2021-002-FB-UA-1

Facebook should link the rule in the Hate Speech Community Standard prohibiting blackface to the company’s reasoning for the rule, including harms it seeks to prevent.

13 May 2021

Implementing fully

Complete

2021-002-FB-UA-2

In line with the board’s recommendation in the Armenians in Azerbaijan case, Facebook should “ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing.” In this case, any notice to users should specify the rule on blackface and link to the above-mentioned resources explaining the harm this rule seeks to prevent. The board also asked Facebook to provide a detailed update on its “feasibility assessment” of the prior recommendations on this topic, including the specific nature of any technical limitations and how they can be overcome.

13 May 2021

Implementing in part

In progress

2021-003-FB-UA-1

Translate the Community Standards and Internal Implementation Standards into Punjabi. Facebook should aim to make its Community Standards accessible in all languages widely spoken by its users.

27 May 2021

Implementing in part

In progress

2021-003-FB-UA-2

The company should restore human review and access to a human appeals process to pre-pandemic levels as soon as possible while fully protecting the health of Facebook’s staff and contractors.

27 May 2021

Work Meta already does

No further updates

2021-003-FB-UA-3

Facebook should improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard.

27 May 2021

Assessing feasibility

In progress

2020-007-FB-FBR-1

Provide people with additional information regarding the scope and enforcement of restrictions on veiled threats. This would help people understand what content is allowed in this area. Facebook should make its enforcement criteria public. These criteria should consider the intent and identity of the person, as well as their audience and the wider context.

11 Mar 2021

Implementing in part

Complete

2020-006-FB-FBR-7

In cases where people post information about COVID-19 treatments that contradicts the specific advice of health authorities and where a potential for physical harm is identified but is not imminent, Facebook should adopt a range of less intrusive measures.

25 Feb 2021

No further action

No further updates

2020-006-FB-FBR-6

Conduct a human rights impact assessment with relevant stakeholders as part of its process of rule modification.

25 Feb 2021

Implementing in part

Complete

2020-006-FB-FBR-5

Publish a transparency report on how the Community Standards have been enforced during the COVID-19 global health crisis.

25 Feb 2021

Work Meta already does

No further updates

2020-006-FB-FBR-4

To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should conduct an assessment of its existing range of tools to deal with health misinformation and consider the potential for development of further tools that are less intrusive than content removals.

25 Feb 2021

Implementing in part

Complete

2020-006-FB-FBR-3

To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should clarify the particular harms it is seeking to prevent and provide transparency about how it will assess the potential harm of particular content.

25 Feb 2021

Implementing in part

Complete

2020-006-FB-FBR-2

Facebook should 1) publish its range of enforcement options within the Community Standards, ranking these options from most to least intrusive based on how they infringe freedom of expression, 2) explain what factors, including evidence-based criteria, the platform will use in selecting the least intrusive option when enforcing its Community Standards to protect public health and 3) make clear within the Community Standards what enforcement option applies to each policy.

25 Feb 2021

Implementing in part

Complete

2020-006-FB-FBR-1

Clarify the Community Standards with respect to health misinformation, particularly with regard to COVID-19. Facebook should set out a clear and accessible policy on health misinformation, consolidating and clarifying existing policies in one place.

25 Feb 2021

Implementing in part

Complete

2020-005-FB-UA-3

Provide a public list of the organizations and individuals designated “dangerous” under the policy on dangerous individuals and organizations.

25 Feb 2021

No further action

No further updates

2020-005-FB-UA-2

Explain and provide examples of the application of key terms used in the Dangerous Individuals and Organizations policy. These should align with the definitions used in Facebook’s Internal Implementation Standards.

25 Feb 2021

Implementing in part

Complete

2020-005-FB-UA-1

Ensure that users are always notified of the Community Standards Facebook is enforcing.

25 Feb 2021

Implementing in part

In progress

2020-003-FB-UA-1

Go beyond naming the policy that Facebook is enforcing, and add more specifics about what part of the Facebook Community Standards the user violated.

25 Feb 2021

Implementing in part

In progress

2020-004-IG-UA-6

Expand transparency reporting to disclose data on the number of automated removal decisions, and the proportion of those decisions subsequently reversed following human review.

25 Feb 2021

Assessing feasibility

In progress

2020-004-IG-UA-5

Inform people when automation is used to take enforcement action against their content, including accessible descriptions of what this means.

25 Feb 2021

Implementing fully

In progress

2020-004-IG-UA-4

Ensure people can appeal decisions taken by automated systems to human review when their content is found to have violated the policy on adult nudity and sexual activity.

25 Feb 2021

Implementing fully

Complete

2020-004-IG-UA-3

When communicating to people about how they violated policies, be clear about the relationship between the Community Guidelines and Community Standards.

25 Feb 2021

Implementing in part

In progress

2020-004-IG-UA-2

Revise the Instagram Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between the Community Guidelines and the Community Standards, the latter take precedence.

25 Feb 2021

Implementing fully

In progress

2020-004-IG-UA-1

Improve automated detection of images with text overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.

25 Feb 2021

Implementing fully

Complete