Policy details

Change log

Current version: 28 Apr 2022

Previous versions: 30 Sep 2021, 17 Dec 2020, 18 Nov 2020, 30 Jul 2020, 4 May 2020, 1 Jul 2019, 26 Apr 2019, 20 Mar 2019, 31 Aug 2018

Policy rationale

We care deeply about the safety of the people who use our apps. We regularly consult with experts in suicide and self-injury to help inform our policies and enforcement, and work with organisations around the world to provide assistance to people in distress.

While we do not allow people to intentionally or unintentionally celebrate or promote suicide or self-injury, we do allow people to discuss these topics because we want Facebook to be a space where people can share their experiences, raise awareness about these issues, and seek support from one another.

We define self-injury as the intentional and direct injuring of the body, including self-mutilation and eating disorders. We remove any content that encourages suicide or self-injury, including fictional content such as memes or illustrations, and any self-injury content that is graphic, regardless of context. We also remove content that identifies and negatively targets victims or survivors of suicide or self-injury, whether seriously, humorously or rhetorically, as well as real-time depictions of suicide or self-injury. Content about recovery from suicide or self-harm that is allowed but may contain imagery that could be upsetting, such as a healed scar, is placed behind a sensitivity screen.

When people post or search for suicide or self-injury-related content, we will direct them to local organisations that can provide support, and if our Community Operations team is concerned about immediate harm, we will contact local emergency services to get them help. For more information, visit the Facebook Safety Centre.

With respect to live content, experts have told us that if someone is saying they intend to attempt suicide on a live-stream, we should leave the content up for as long as possible, because the longer someone is talking to a camera, the more opportunity there is for a friend or family member to call emergency services.

However, to minimise the risk of others being negatively affected by viewing this content, we will stop the live-stream at the point at which the threat turns into an attempt. As mentioned above, in any case, we will contact the emergency services if we identify that someone is at immediate risk of harming themselves.

Do not post:

Content that promotes, encourages, coordinates or provides instructions for:
  • Suicide.
  • Self-injury.
  • Eating disorders.

Content that depicts graphic self-injury imagery.

Content that depicts a person who engaged in a suicide attempt or death by suicide.

Content that focuses on the depiction of ribs, collar bones, thigh gaps, hips, concave stomach or protruding spine or scapula when shared together with terms associated with eating disorders.

Content that contains instructions for drastic and unhealthy weight loss when shared together with terms associated with eating disorders.

Content that mocks victims or survivors of suicide, self-injury or eating disorders who are publicly known or implied to have experienced suicide or self-injury.

For the following content, we restrict viewing to adults aged 18 and over, and include a sensitivity screen so that people are aware that the content may be upsetting:

  • Photos or videos depicting a person who engaged in euthanasia/assisted suicide in a medical setting.

For the following content, we include a sensitivity screen so that people are aware that the content may be upsetting to some:

  • Content that depicts older instances of self-harm, such as healed cuts or other non-graphic self-injury imagery, in a self-injury, suicide or recovery context.
  • Content that depicts ribs, collar bones, thigh gaps, hips, concave stomach or protruding spine or scapula in a recovery context.

We provide resources to people who post written or verbal admissions of engagement in self-injury, including:

  • Suicide.
  • Euthanasia/assisted suicide.
  • Self-harm.
  • Eating disorders.
  • Vague, potentially suicidal statements or references (including memes or stock imagery about sad mood or depression) in a suicide or self-injury context.

For the following Community Standards, we require additional information and/or context to enforce:

  • We may remove suicide notes when we have confirmation of a suicide or suicide attempt. We try to identify suicide notes using several factors, including, but not limited to, family or legal representative requests, media reports, law enforcement reports or other third-party sources (e.g. government agencies, NGOs).

User experiences

See some examples of what enforcement looks like for people on Facebook, such as what it looks like to report something that you don't think should be on Facebook, to be told that you've violated our Community Standards, and to see a warning screen over certain content.

Note: We're always improving, so what you see here may be slightly outdated compared to what we currently use.

Data

Prevalence: Percentage of times that people saw violating content.

Content actioned: Number of pieces of violating content that we took action on.

Proactive rate: Percentage of violating content that we found before people reported it.

Appealed content: Number of pieces of content that people appealed after we took action on it.

Restored content: Number of pieces of content that we restored after we originally took action on it.


Reporting
1. Universal entry point

We have an option to report, whether it's on a post, a comment, a story, a message or something else.

2. Getting started

We help people report things that they don't think should be on our platform.

3. Select a problem

We ask people to tell us more about what's wrong. This helps us send the report to the right place.

4. Report submitted

After these steps, we submit the report. We also lay out what people should expect next.

Post-report communication
1. Update via notifications

After we've reviewed the report, we'll send the reporting user a notification.

2. More detail in the Support Inbox

We'll share more details about our review decision in the Support Inbox. We'll notify people that this information is there and send them a link to it.

3. Appeal option

If people think we made the wrong decision, they can request another review.

4. Post-appeal communication

We'll send a final response after we've re-reviewed the content, again to the Support Inbox.

Takedown experience
1. Immediate notification

When someone posts something that violates our Community Standards, we'll tell them.

2. Additional context

We'll also address common misperceptions around enforcement.

3. Explain the policy

We'll give people easy-to-understand explanations about why their content was removed.

4. Ask for input

After we've established the context for our decision and explained our policy, we'll ask people what they'd like to do next, including letting us know if they think we made a mistake.

5. Tell us more

If people disagree with the decision, we'll ask them to tell us more.

6. Set expectations

Here, we set expectations on what will happen next.

Warning screens
1. Warning screens in context

We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information

In this example, we explain why we've covered the photo, with additional context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.

Get help with suicide and self-injury

Learn what you can do if you see something on Facebook that goes against our Community Standards.