Oversight Board recommendations

Updated Jun 11, 2021

In addition to its binding decisions on content, the Oversight Board can also issue recommendations on Facebook’s content policies and on how we enforce them on the Facebook app and Instagram.

Facebook is committed both to considering these recommendations as important inputs to our internal policy processes and to publicly responding to each recommendation within 30 days. Unlike the board’s decisions on individual cases, recommendations are not binding on Facebook.

Board recommendations

| Recommendation | Date | Action | Status |
| --- | --- | --- | --- |
| Facebook should act quickly on posts made by influential users that pose a high probability of imminent harm. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should consider the context of posts by influential users when assessing a post’s risk of harm. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should prioritize safety over expression when taking action on a threat of harm from influential users. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should suspend the accounts of high government officials, such as heads of state, if their posts repeatedly pose a risk of harm. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should suspend accounts of high government officials, such as heads of state, for a determinate period sufficient to protect against imminent harm. Periods of suspension should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should resist pressure from governments to silence their political opposition, and should consider the relevant political context, including off Facebook and Instagram, when evaluating political speech from highly influential users. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should have an adequately resourced process that draws on regional political and linguistic expertise when evaluating political speech from highly influential users. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should publicly explain the rules it uses when imposing account-level sanctions against influential users. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should assess the on- and offline risk of harm before lifting an influential user’s account suspension. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should document any exceptional processes that apply to influential users. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should more clearly explain its newsworthiness allowance. | 4 Jun 2021 | Implementing fully | Under investigation |
| With regard to cross-check review for influential users, Facebook should clearly explain the rationale, standards, and processes of review, including the criteria for determining which pages and accounts are selected for inclusion. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should report on the relative error rates and thematic consistency of determinations made through the cross-check process compared with ordinary enforcement procedures. | 4 Jun 2021 | No further action | Under investigation |
| Facebook should review its potential role in the election fraud narrative that sparked violence in the United States on January 6, 2021, and report on its findings. | 4 Jun 2021 | Implementing in part | Under investigation |
| Facebook should make clear in its Corporate Human Rights Policy how it collects, preserves, and shares information related to investigations and potential prosecutions, including how researchers can access that information. | 4 Jun 2021 | Assessing feasibility | Under investigation |
| Facebook should explain in its Community Standards and Guidelines, in a clear, comprehensive, and accessible manner, its strikes and penalties process for restricting profiles, pages, groups, and accounts on Facebook and Instagram. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should tell users how many violations, strikes, and penalties they have, as well as the consequences of future violations. | 4 Jun 2021 | Implementing fully | Under investigation |
| In its transparency reporting, Facebook should include the number of profile, page, and account restrictions, including the reason for and manner in which enforcement action was taken, broken down by region and country. | 4 Jun 2021 | Assessing feasibility | Under investigation |
| Facebook should develop and publish a policy that governs its response to crises or novel situations where its regular processes would not prevent or avoid imminent harm. | 4 Jun 2021 | Implementing fully | Under investigation |
| Facebook should translate its Community Standards and Internal Implementation Standards into Punjabi, and should aim to make its Community Standards accessible in all languages widely spoken by its users. | 27 May 2021 | Committed to action | Under investigation |
| The company should restore human review and access to a human appeals process to pre-pandemic levels as soon as possible, while fully protecting the health of Facebook’s staff and contractors. | 27 May 2021 | Committed to action | Under investigation |
| Facebook should improve its transparency reporting to increase public information on error rates by making this information viewable by country and language for each Community Standard. | 27 May 2021 | Assessing feasibility | Under investigation |
| Provide people with additional information about the scope and enforcement of restrictions on veiled threats so they understand what content is allowed in this area. Facebook should make its enforcement criteria public; the criteria should consider the intent and identity of the person, as well as their audience and the wider context. | 11 Mar 2021 | Committed to action | Under investigation |
| Go beyond naming the policy Facebook is enforcing and specify which part of the Facebook Community Standards the content violated. | 25 Feb 2021 | Assessing feasibility | Under investigation |
| Improve automated detection of images with text overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement. | 25 Feb 2021 | Committed to action | Under investigation |
| Revise the Instagram Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness, and clarify that where there are inconsistencies between the Community Guidelines and the Community Standards, the latter take precedence. | 25 Feb 2021 | Committed to action | Under investigation |
| When communicating to people about how they violated policies, be clear about the relationship between the Community Guidelines and the Community Standards. | 25 Feb 2021 | Committed to action | Under investigation |
| Ensure people can appeal decisions made by automated systems to human review when their content is found to have violated the policy on adult nudity and sexual activity. | 25 Feb 2021 | Assessing feasibility | Under investigation |
| Inform people when automation is used to take enforcement action against their content, including accessible descriptions of what this means. | 25 Feb 2021 | Assessing feasibility | Under investigation |
| Expand transparency reporting to disclose data on the number of automated removal decisions and the proportion of those decisions subsequently reversed following human review. | 25 Feb 2021 | Assessing feasibility | Under investigation |
| Provide a public list of the organizations and individuals designated “dangerous” under the policy on dangerous individuals and organizations. | 25 Feb 2021 | Assessing feasibility | Under investigation |
| Clarify the Community Standards with respect to health misinformation, particularly with regard to COVID-19. Facebook should set out a clear and accessible policy on health misinformation, consolidating and clarifying existing policies in one place. | 25 Feb 2021 | Committed to action | Under investigation |
| Facebook should (1) publish its range of enforcement options within the Community Standards, ranking these options from most to least intrusive based on how they infringe freedom of expression; (2) explain what factors, including evidence-based criteria, the platform will use in selecting the least intrusive option when enforcing its Community Standards to protect public health; and (3) make clear within the Community Standards which enforcement option applies to each policy. | 25 Feb 2021 | Committed to action | Under investigation |
| To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should clarify the particular harms it is seeking to prevent and provide transparency about how it will assess the potential harm of particular content. | 25 Feb 2021 | Committed to action | Under investigation |
| To ensure enforcement measures on health misinformation represent the least intrusive means of protecting public health, Facebook should assess its existing range of tools for dealing with health misinformation and consider developing further tools that are less intrusive than content removal. | 25 Feb 2021 | Committed to action | Under investigation |
| Publish a transparency report on how the Community Standards have been enforced during the COVID-19 global health crisis. | 25 Feb 2021 | Committed to action | Under investigation |
| Conduct a human rights impact assessment with relevant stakeholders as part of its process of rule modification. | 25 Feb 2021 | Committed to action | Under investigation |
| In cases where people post information about COVID-19 treatments that contradicts the specific advice of health authorities, and where a potential for physical harm is identified but is not imminent, Facebook should adopt a range of less intrusive measures. | 25 Feb 2021 | No further action | Closed |