Policy details


Policy Rationale

We care deeply about the safety of the people who use our apps. We regularly consult with experts in suicide, self-injury and eating disorders to help inform our policies and enforcement, and work with organizations around the world to provide assistance to people in distress.

While we do not allow people to intentionally or unintentionally celebrate or promote suicide, self-injury, or eating disorders, we do allow people to discuss these topics because we want Facebook to be a space where people can share their experiences, raise awareness about these issues, and seek support from one another.

We remove any content that encourages suicide, self-injury, or eating disorders, including fictional content such as memes or illustrations, and any self-injury content that is graphic, regardless of context. We also remove content that mocks victims or survivors of suicide, self-injury or eating disorders, as well as real-time depictions of suicide or self-injury. Content about recovery from suicide, self-injury, or eating disorders that is allowed but may contain upsetting imagery, such as a healed scar, is placed behind a sensitivity screen.

When people post or search for suicide, self-injury, or eating disorder-related content, we will direct them to local organizations that can provide support, and if our Community Operations team is concerned about immediate harm, we will contact local emergency services to get them help. For more information, visit the Facebook Safety Center.

With respect to live content, experts have told us that if someone is saying they intend to attempt suicide on a livestream, we should leave the content up for as long as possible, because the longer someone is talking to a camera, the more opportunity there is for a friend or family member to call emergency services.

However, to minimize the risk of others being negatively impacted by viewing this content, we will stop the livestream at the point at which the threat turns into an attempt. As mentioned above, in any case, we will contact emergency services if we identify someone is at immediate risk of harming themselves.

Do not post:

Content that promotes, encourages, coordinates, or provides instructions for:
  • Suicide.
  • Self-injury.
  • Eating disorders.

Content that depicts graphic self-injury imagery.

It is against our policies to post content depicting a person who engaged in a suicide attempt or death by suicide.

Content that focuses on the depiction of ribs, collar bones, thigh gaps, hips, concave stomach, or protruding spine or scapula when shared together with terms associated with eating disorders.

Content that contains instructions for drastic and unhealthy weight loss when shared together with terms associated with eating disorders.

Content that mocks victims or survivors of suicide, self-injury or eating disorders who are either publicly known or implied to have experienced suicide or self-injury.

For the following content, we restrict viewing to adults over the age of 18 and include a sensitivity screen so that people are aware the content may be upsetting:

  • Photos or videos depicting a person who engaged in euthanasia/assisted suicide in a medical setting.

For the following content, we include a sensitivity screen so that people are aware the content may be upsetting to some:

  • Content that depicts older instances of self-harm such as healed cuts or other non-graphic self-injury imagery in a self-injury, suicide or recovery context.
  • Content that depicts ribs, collar bones, thigh gaps, hips, concave stomach, or protruding spine or scapula in a recovery context.

We provide resources to people who post written or verbal admissions of engagement in self-injury, including:

  • Suicide.
  • Euthanasia/assisted suicide.
  • Self-harm.
  • Eating disorders.
  • Vague, potentially suicidal statements or references (including memes or stock imagery about sad mood or depression) in a suicide or self-injury context.

For the following Community Standards, we require additional information and/or context to enforce:

  • We may remove suicide notes when we have confirmation of a suicide or suicide attempt. We try to identify suicide notes using several factors, including but not limited to, family or legal representative requests, media reports, law enforcement reports or other third party sources (e.g. government agencies, NGOs).

User experiences

See some examples of what enforcement looks like for people on Facebook, such as what it looks like to report something you don’t think should be on Facebook, to be told you’ve violated our Community Standards, and to see a warning screen over certain content.

Note: We’re always improving, so what you see here may be slightly outdated compared to what we currently use.

Data

Prevalence: Percentage of times people saw violating content.

Content actioned: Number of pieces of violating content we took action on.

Proactive rate: Percentage of violating content we found before people reported it.

Appealed content: Number of pieces of content people appealed after we took action on it.

Restored content: Number of pieces of content we restored after we originally took action on it.


Reporting
1. Universal entry point: We have an option to report, whether it’s on a post, a comment, a story, a message or something else.

2. Get started: We help people report things that they don’t think should be on our platform.

3. Select a problem: We ask people to tell us more about what’s wrong. This helps us send the report to the right place.

4. Report submitted: After these steps, we submit the report. We also lay out what people should expect next.

Post-report communication
1. Update via notifications: After we’ve reviewed the report, we’ll send the reporting user a notification.

2. More detail in the Support Inbox: We’ll share more details about our review decision in the Support Inbox. We’ll notify people that this information is there and send them a link to it.

3. Appeal option: If people think we got the decision wrong, they can request another review.

4. Post-appeal communication: We’ll send a final response after we’ve re-reviewed the content, again to the Support Inbox.

Takedown experience
1. Immediate notification: When someone posts something that doesn’t follow our rules, we’ll tell them.

2. Additional context: We’ll also address common misperceptions and explain why we made the decision to enforce.

3. Policy explanation: We’ll give people easy-to-understand explanations about the relevant rule.

4. Option for review: If people disagree with the decision, they can ask for another review and provide more information.

5. Final decision: We set expectations about what will happen after the review has been submitted.

Warning screens
1. Warning screens in context: We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information: In this example, we explain why we’ve covered the photo, with additional context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.

Get help with suicide and self-injury

Learn what you can do if you see something on Facebook that goes against our Community Standards.