Providing context on sensitive or misleading content

UPDATED APR 2, 2024

One way Meta promotes a safe, authentic community is by informing people that content might be sensitive or misleading, even if it doesn’t explicitly violate the Facebook Community Standards or Instagram Community Guidelines. In these cases, we include additional context about the content to help people decide what to read, trust or share.

How we provide context on content

By providing people with specific and relevant context when they come across a flagged post, we can help them be more informed about what they see and read. Here are some ways we provide context on relevant pieces of content that may be sensitive or misleading:

Warning screens on sensitive content on Facebook, Instagram and Threads

Our goal is to protect people from viewing potentially sensitive content.

Facebook

People value the ability to discuss important and often difficult issues online, but they also have different sensitivities to certain kinds of content. Therefore, we include a warning screen over potentially sensitive content on Facebook, such as:

  • Violent or graphic imagery.

  • Posts that contain descriptions of bullying or harassment, if shared to raise awareness.

  • Some forms of nudity.

  • Posts related to suicide or suicide attempts.

Example warning screens:

  • Warning screens in context: We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

  • More information: In this example, we explain why we’ve covered the photo, with additional context from independent fact-checkers.

Instagram

To help people avoid coming across content they’d rather not see, we limit the visibility of certain posts that are flagged by people on Instagram for containing sensitive or graphic material. Photos and videos containing such content will appear with a warning screen to inform people about the content before they view it. This warning screen appears when viewing a post in feed or on someone's profile.
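
As a rough sketch of the idea described above (not Meta’s actual systems; the category labels and signals below are hypothetical stand-ins), the warning-screen treatment can be thought of as a simple decision: content in certain sensitive categories stays up but is covered until someone chooses to see it.

```python
# Illustrative sketch only -- not Meta's implementation. Category labels and
# the user-report signal are hypothetical stand-ins for real classification data.

# Categories listed above as eligible for a warning screen on Facebook.
WARNING_SCREEN_CATEGORIES = {
    "violent_or_graphic",
    "bullying_or_harassment_awareness",  # shared to raise awareness
    "some_nudity",
    "suicide_related",
}

def display_treatment(labels: set[str], flagged_sensitive_by_people: bool = False) -> str:
    """Decide whether a post is shown normally or covered by a warning screen."""
    if labels & WARNING_SCREEN_CATEGORIES or flagged_sensitive_by_people:
        # The post remains available, but is covered so people can choose to view it.
        return "warning_screen"
    return "show_normally"

# A graphic image shared to document violence is covered rather than removed.
print(display_treatment({"violent_or_graphic"}))   # warning_screen
print(display_treatment({"vacation_photo"}))       # show_normally
```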

Verified badges on Facebook, Instagram, Messenger and Threads

Our goal is to help people feel confident about the content and accounts they interact with.

To combat impersonation and help people avoid scammers who pretend to be high-profile people, Meta provides verified badges on Pages and profiles that indicate a verified account. This means we’ve confirmed the authentic presence of the public figure, celebrity or global brand that the account represents.

Notification screens on outdated articles on the Facebook app

Our goal is to make it easier for people to identify content that’s timely, reliable and most valuable to them.

To give people more context about a news article before they share it on Facebook, Meta includes a notification screen if the article is more than 90 days old. People can still share the article after seeing the notification, which helps them understand how old the article is and where it comes from.

To ensure we don’t slow the spread of credible information, especially in the health space, content posted by government health authorities and recognized global health organizations does not have this notification screen.
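
To make the 90-day rule concrete, here is a minimal sketch of the decision it describes. This is an illustration only, not Meta’s actual logic; the allow-list of health-authority domains and the function name are hypothetical.

```python
# Illustrative sketch of the rule described above: articles older than 90 days
# trigger a notification screen before sharing, except content from government
# health authorities and recognized global health organizations.
from datetime import date, timedelta

ARTICLE_AGE_THRESHOLD = timedelta(days=90)

# Hypothetical allow-list standing in for recognized health authorities.
HEALTH_AUTHORITY_SOURCES = {"who.int", "cdc.gov"}

def needs_outdated_notice(published_on: date, source_domain: str, today: date | None = None) -> bool:
    """Return True if sharing this article should first show the notification screen."""
    today = today or date.today()
    if source_domain in HEALTH_AUTHORITY_SOURCES:
        # Credible health information is exempt so its spread isn't slowed.
        return False
    # People can still share after seeing the notice; this only decides whether it appears.
    return (today - published_on) > ARTICLE_AGE_THRESHOLD

# A two-year-old article prompts the notice; a recent one does not.
print(needs_outdated_notice(date(2022, 1, 15), "example-news.com", today=date(2024, 4, 2)))  # True
print(needs_outdated_notice(date(2024, 3, 20), "example-news.com", today=date(2024, 4, 2)))  # False
```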
