Policy details
Policy Rationale

We do not allow content or activity that sexually exploits or endangers children. When we become aware of apparent child exploitation, we report it to the National Center for Missing and Exploited Children (NCMEC), in compliance with applicable law. We know that sometimes people share nude images of their own children with good intentions; however, we generally remove these images because of the potential for abuse by others and to help avoid the possibility of other people reusing or misappropriating the images.

We also work with external experts, including the Facebook Safety Advisory Board, to discuss and improve our policies and enforcement around online safety issues, especially with regard to children. Learn more about the technology we’re using to fight against child exploitation.

Do not post:

Child sexual exploitation

Content or activity that threatens, depicts, praises, supports, provides instructions for, makes statements of intent about, admits participation in, or shares links to the sexual exploitation of children (real or non-real minors, toddlers, or babies), including but not limited to:

  • Sexual intercourse
    • Explicit sexual intercourse or oral sex, defined as mouth or genitals entering or in contact with another person's genitals or anus, where at least one person's genitals are nude.
    • Implied sexual intercourse or oral sex, including when contact is imminent or not directly visible.
    • Stimulation of genitals or anus, including when activity is imminent or not directly visible.
    • Presence of by-products of sexual activity.
    • Any of the above involving an animal.
  • Children with sexual elements, including but not limited to:
    • Restraints.
    • Focus on genitals.
    • Presence of aroused adult.
    • Presence of sex toys.
    • Sexualized costume.
    • Stripping.
    • Staged environment (for example, on a bed) or professionally shot (quality/focus/angles).
    • Open-mouth kissing.
  • Content of children in a sexual fetish context.
  • Content that supports, promotes, advocates or encourages participation in pedophilia unless it is discussed neutrally in an academic or verified health context.
  • Content that identifies or mocks alleged victims of child sexual exploitation by name or image.

Solicitation

Content that solicits:

  • Child Sexual Abuse Material (CSAM)
  • Nude imagery of children
  • Sexualized imagery of children
  • Real-world sexual encounters with children

Inappropriate interactions with children

Content that constitutes or facilitates inappropriate interactions with children, such as:

  • Arranging or planning real-world sexual encounters with children
  • Purposefully exposing children to sexually explicit language or sexual material
  • Engaging in implicitly sexual conversations in private messages with children
  • Obtaining or requesting sexual material from children in private messages

Exploitative intimate imagery and sextortion

Content that attempts to exploit minors by:

  • Coercing money, favors or intimate imagery with threats to expose intimate imagery or information.
  • Sharing, threatening or stating an intent to share private sexual conversations or intimate imagery.

Sexualization of children

  • Content (including photos, videos, real-world art, digital content, and verbal depictions) that sexualizes children.
  • Groups, Pages and profiles dedicated to sexualizing children.

Child nudity

Content that depicts child nudity, where nudity is defined as:

  • Close-ups of children’s genitalia
  • Real nude toddlers, showing:
    • Visible genitalia, even when covered or obscured by transparent clothing.
    • Visible anus and/or fully nude close-up of buttocks.
  • Real nude minors, showing:
    • Visible genitalia (including genitalia obscured only by pubic hair or transparent clothing)
    • Visible anus and/or fully nude close-up of buttocks.
    • Uncovered female nipples.
    • No clothes present from neck to knee, even if no genitalia or female nipples are showing.
  • Digitally-created depictions of nude minors, toddlers or babies unless the image is for health or educational purposes.

Non-sexual child abuse

Imagery that depicts non-sexual child abuse, regardless of sharing intent.

Content that praises, supports, promotes, advocates for, provides instructions for or encourages participation in non-sexual child abuse.

For the following content, we include a warning screen so that people are aware the content may be disturbing and limit the ability to view the content to adults, ages eighteen and older:

  • Videos or photos that depict police officers or military personnel committing non-sexual child abuse.
  • Imagery of non-sexual child abuse, when law enforcement, child protection agencies, or trusted safety partners request that we leave the content on the platform for the express purpose of bringing a child back to safety.

For the following content, we include a sensitivity screen so that people are aware the content may be upsetting to some:

  • Videos or photos of violent immersion of a child in water in the context of religious rituals.

For the following Community Standards, we require additional information and/or context to enforce:

For the following content, we include a warning label so that people are aware that the content may be sensitive:

  • Imagery posted by a news agency that depicts child nudity in the context of famine, genocide, war crimes, or crimes against humanity, unless accompanied by a violating caption or shared in a violating context, in which case the content is removed.

We may also remove imagery depicting the aftermath of non-sexual child abuse when reported by news media partners, NGOs or other trusted safety partners.

User experiences

See some examples of what enforcement looks like for people on Facebook, such as what it looks like to report something you don't think should be on Facebook, to be told you've violated our Community Standards, and to see a warning screen over certain content.

Note: We’re always improving, so what you see here may be slightly outdated compared to what we currently use.

Data
Prevalence

Percentage of times people saw violating content

Content actioned

Number of pieces of violating content we took action on

Proactive rate

Percentage of violating content we found before people reported it

Appealed content

Number of pieces of content people appealed after we took action on it

Restored content

Number of pieces of content we restored after we originally took action on it


Reporting
1. Universal entry point

We have an option to report, whether it’s on a post, a comment, a story, a message or something else.

2. Get started

We help people report things that they don’t think should be on our platform.

3. Select a problem

We ask people to tell us more about what’s wrong. This helps us send the report to the right place.

4. Report submitted

After these steps, we submit the report. We also lay out what people should expect next.

Post-report communication
1. Update via notifications

After we’ve reviewed the report, we’ll send the reporting user a notification.

2. More detail in the Support Inbox

We’ll share more details about our review decision in the Support Inbox. We’ll notify people that this information is there and send them a link to it.

3. Appeal option

If people think we got the decision wrong, they can request another review.

4. Post-appeal communication

We’ll send a final response after we’ve re-reviewed the content, again to the Support Inbox.

Takedown experience
1. Immediate notification

When someone posts something that violates our Community Standards, we’ll tell them.

2. Additional context

We’ll also address common misperceptions around enforcement.

3. Explain the policy

We’ll give people easy-to-understand explanations of why their content was removed.

4. Ask for input

After we’ve established the context for our decision and explained our policy, we’ll ask people what they'd like to do next, including letting us know if they think we made a mistake.

5. Tell us more

If people disagree with the decision, we’ll ask them to tell us more.

6. Set expectations

Here, we set expectations on what will happen next.

Warning screens
1. Warning screens in context

We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information

In this example, we explain why we’ve covered the photo, with additional context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.

Get help with child sexual exploitation, abuse and nudity

Learn what you can do if you see something on Facebook that goes against our Community Standards.