Policy details

Change log

Current version

Feb 23, 2023
Sep 29, 2022
Feb 24, 2022
Nov 24, 2021
Sep 30, 2021
Nov 18, 2020
Nov 30, 2018
Jun 29, 2018

Policy Rationale

Privacy and the protection of personal information are fundamentally important values for Facebook. We work hard to safeguard your personal identity and information, and we do not allow people to post personal or confidential information about themselves or others.

We remove content that shares, offers or solicits personally identifiable information or other private information that could lead to physical or financial harm, including financial, residential, and medical information, as well as private information obtained from illegal sources. We also recognize that private information may become publicly available through news coverage, court filings, press releases, or other sources. When that happens, we may allow the information to be posted.

We also provide people ways to report imagery that they believe to be in violation of their privacy rights.

Do not post:

Content that shares or solicits any of the following private information, either on Facebook or through external links:

Personally identifiable information about yourself or others

  • Personal identity: identifying individuals through government-issued numbers.
    • National identification number (for example Social Security Number (SSN), Passport Number, National Insurance/Health Service Number, Personal Public Service Number (PPS), Individual Taxpayer Identification Number (ITIN)).
    • Government IDs of law enforcement, military, or security personnel.
  • Personal information: directly identifying an individual by indicating the ID number or registration information together with the individual’s name.
    • Records or official documentation of civil registry information (marriage, birth, death, name change or gender recognition, and so on).
    • Immigration and work status documents (for example, green cards, work permits, visas, or immigration papers).
    • Driver’s licenses or license plates, except when license plates are shared to facilitate finding missing people or animals.
    • Credit Privacy Number (CPN).
  • Digital identity: authenticating access to an online identity.
    • Email addresses with passwords.
    • Digital identities with passwords.
    • Passwords, pins or codes to access private information.

Other private information

  • Personal contact information of others such as phone numbers, addresses and email addresses, except when shared or solicited to promote charitable causes, find missing people, animals, or objects, or contact business service providers.
  • Financial information.
    • Personal financial information about yourself or others, including:
      • Non-public financial records or statements.
      • Bank account numbers with security or pin codes.
      • Digital payment method information with log in details, security or pin codes.
      • Credit or debit card information with validity dates or security pins or codes.
    • Financial information about businesses or organizations, unless originally shared by the organization itself, including:
      • Financial records or statements, except when the financial records of the business are publicly available (for example, listed on stock exchanges or filed with regulatory agencies).
      • Bank account numbers accompanied by security or pin codes.
      • Digital payment method information accompanied by log in details, security or pin codes.
  • Residential information
    • Imagery that displays the external view of private residences, if all of the following conditions apply (see the sketch after this list):
      • The residence is a single-family home, or the resident's unit number is identified in the image or caption.
      • The city/neighborhood or GPS pin (for example, a pin from Google Maps) is identified.
      • The content identifies the resident(s).
      • The resident objects to the exposure of their private residence, or there is context of organizing protests against the resident (this does not include embassies that also serve as residences).
    • Content that exposes information about safe houses by sharing any of the below, unless the safe house actively promotes information about its facility:
      • Actual address (Note: "Post Box only" is allowed.)
      • Images of the safe house.
      • Identifiable city/neighborhood of the safe house.
      • Information exposing the identity of the safe house residents.
  • Medical information
    • Records or official documentation displaying medical, psychological, biometric, or genetic/hereditary information of others.
  • Information obtained from hacked sources.
    • Except in limited cases of newsworthiness, content claimed or confirmed to come from a hacked source, regardless of whether the affected person is a public figure or a private individual.
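
How the residential-imagery conditions above combine can be read as a strict conjunction: the imagery is treated as violating only when every listed condition applies, and embassies that double as residences are excluded. Below is a minimal illustrative sketch of that logic; the field and function names are hypothetical and do not reflect any actual enforcement code.

```python
from dataclasses import dataclass


@dataclass
class ResidenceImageryReport:
    """Hypothetical summary of a reported image showing a private residence."""
    is_single_family_home: bool    # the residence is a single-family home
    unit_number_identified: bool   # or the resident's unit number is shown in the image/caption
    location_identified: bool      # city/neighborhood or GPS pin is identified
    resident_identified: bool      # the content identifies the resident(s)
    resident_objects: bool         # the resident objects to the exposure
    protest_context: bool          # context of organizing protests against the resident
    is_embassy_residence: bool     # embassies that also serve as residences are excluded


def violates_residential_privacy(report: ResidenceImageryReport) -> bool:
    """All of the listed conditions must apply for the imagery to be violating."""
    if report.is_embassy_residence:
        return False
    return (
        (report.is_single_family_home or report.unit_number_identified)
        and report.location_identified
        and report.resident_identified
        and (report.resident_objects or report.protest_context)
    )
```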

The following content may also be removed (see the sketch after this list):

  • A reported photo or video of people where the person depicted in the image is:
    • A minor under 13 years old, and the content was reported by the minor or a parent or legal guardian.
    • A minor between 13 and 18 years old, and the content was reported by the minor.
    • An adult, where the content was reported by the adult from outside the United States and applicable law may provide rights to removal.
    • Any person who is incapacitated and unable to report the content on their own.
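
Whether a reported photo or video of a person may be removed depends on the depicted person's age, who filed the report, and where it was reported from. The sketch below is a simplified, hypothetical reading of those rules; the names, the handling of the incapacitated case, and the exact age boundaries are assumptions, and actual review also weighs applicable law and context.

```python
from enum import Enum, auto


class Reporter(Enum):
    DEPICTED_PERSON = auto()
    PARENT_OR_GUARDIAN = auto()
    REPRESENTATIVE = auto()   # assumed: someone reporting on behalf of an incapacitated person
    OTHER = auto()


def may_remove_reported_media(age: int,
                              reporter: Reporter,
                              reported_from_us: bool,
                              depicted_is_incapacitated: bool) -> bool:
    """Simplified eligibility check for removing a reported photo or video of a person."""
    # Any person who is incapacitated and unable to report the content on their own
    # (assumed here to be reported by a representative).
    if depicted_is_incapacitated and reporter is Reporter.REPRESENTATIVE:
        return True
    # A minor under 13, reported by the minor or a parent or legal guardian.
    if age < 13 and reporter in (Reporter.DEPICTED_PERSON, Reporter.PARENT_OR_GUARDIAN):
        return True
    # A minor between 13 and 18, reported by the minor.
    if 13 <= age < 18 and reporter is Reporter.DEPICTED_PERSON:
        return True
    # An adult, reported by the adult from outside the United States,
    # where applicable law may provide rights to removal.
    if age >= 18 and reporter is Reporter.DEPICTED_PERSON and not reported_from_us:
        return True
    return False
```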

For the following Community Standards, we require additional information and/or context to enforce:

Do not post:

  • Depictions of someone in a medical or health facility if reported by the person pictured or an authorized representative.
  • Source material that purports to reveal nonpublic information relevant to an election shared as part of a foreign government influence operation.
    • We remove reporting on such a leak by state-controlled media entities from the country behind the leak.

In certain cases, we will allow content that may otherwise violate the Community Standards when it is determined that the content is satirical. Content will only be allowed if the violating elements of the content are being satirized or attributed to something or someone else in order to mock or criticize them.

User experiences

See some examples of what enforcement looks like for people on Facebook, such as what it looks like to report something you don’t think should be on Facebook, to be told you’ve violated our Community Standards, and to see a warning screen over certain content.

Note: We’re always improving, so what you see here may be slightly outdated compared to what we currently use.

Reporting
1. Universal entry point

We have an option to report, whether it’s on a post, a comment, a story, a message or something else.

2. Get started

We help people report things that they don’t think should be on our platform.

3. Select a problem

We ask people to tell us more about what’s wrong. This helps us send the report to the right place.

4. Report submitted

After these steps, we submit the report. We also lay out what people should expect next.

Post-report communication
1. Update via notifications

After we’ve reviewed the report, we’ll send the reporting user a notification.

2. More detail in the Support Inbox

We’ll share more details about our review decision in the Support Inbox. We’ll notify people that this information is there and send them a link to it.

3. Appeal option

If people think we got the decision wrong, they can request another review.

4. Post-appeal communication

We’ll send a final response after we’ve re-reviewed the content, again to the Support Inbox.

Takedown experience
1. Immediate notification

When someone posts something that violates our Community Standards, we’ll tell them.

2. Additional context

We’ll also address common misperceptions around enforcement.

3. Explain the policy

We’ll give people easy-to-understand explanations of why their content was removed.

4. Ask for input

After we’ve established the context for our decision and explained our policy, we’ll ask people what they'd like to do next, including letting us know if they think we made a mistake.

5. Tell us more

If people disagree with the decision, we’ll ask them to tell us more.

6. Set expectations

Here, we set expectations on what will happen next.

Warning screens
1. Warning screens in context

We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information

In this example, we explain why we’ve covered the photo, adding context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.

Get help with privacy violations

Learn what you can do if you see something on Facebook that goes against our Community Standards.