Corrections and adjustments

UPDATED: JUN 11, 2021

Below we detail the specific adjustments identified through our information quality practices. We will update this page in accordance with our measurement processes.

5/2021: Content actioned for suicide and self-injury on Facebook

In 2020, some of the content we actioned under our policy for violent and graphic content was later found to violate our policy for suicide and self-injury. We reclassified this content accordingly, which impacted the numbers we previously shared for content actioned on Facebook in 2020.

2/2021: Restored content for adult nudity and sexual activity on Facebook; prevalence and content actioned for violent and graphic content on Facebook; content actioned for suicide and self-injury on Facebook; and content restored by Instagram

In Q4, we introduced clarifications on certain classes of images under our policy on adult nudity and sexual activity on Facebook. We restored some previously actioned content based on the updated policy, which impacted the numbers we previously shared for restored content on Facebook in Q3.

For violent and graphic content on Facebook, prevalence was previously reported in the Community Standards Enforcement Report for November 2020 as between 0.05% and 0.06% of views. In the February 2021 report, we updated prevalence for violent and graphic content to about 0.07% of views in Q3.

In Q2, some of the content we actioned under our policy for violent and graphic content was later found to violate our policy for suicide and self-injury, after we regained some manual review capacity in early September. We reclassified this content accordingly, which impacted the numbers we previously shared for content actioned on Facebook in Q3.

Additionally, we adjusted our restored content numbers for Q1 and Q2 on Instagram to account for previously unreported comments we restored. This resulted in minimal changes across most policy areas on Instagram, and we adjusted previously shared data accordingly. We will continue to update historical numbers as we update our policies and improve our systems and accounting.

11/2020: Updated adjustments to content actioned, proactive rate, content appealed by users, and content restored by Facebook and Instagram

In Q3, we made an update that recategorized previously actioned cruel and insensitive content so it is no longer considered hate speech. This update impacted the numbers we had previously shared for content actioned, proactive rate, appealed content and restored content for Q4 2019, Q1 2020 and Q2 2020, and we’ve adjusted the numbers accordingly. We also updated our policy to remove more types of graphic suicide and self-injury content, and recategorized some violent and graphic content we had previously marked as disturbing in Q2.

Additionally, we adjusted our restored content numbers for Q1 and Q2 on Instagram to account for previously unreported comments we restored, as well as for an issue with our data source in the August 2020 report. This resulted in minimal changes across most policy areas on Facebook and Instagram, and we adjusted previously shared data accordingly. We will continue to update historical numbers as we update our policies and improve our systems and accounting.

8/2020: Content actioned for violent and graphic content on Instagram

In Q1 2020, we identified and corrected an issue with the accounting of actions taken by our proactive detection technology for violent and graphic content on Instagram, and we were able to update our full reporting systems in Q2. For violent and graphic content on Instagram, content actioned in Q1 2020 was previously reported in the May 2020 report as 2.3 million pieces of content, and has been updated to 2.8 million in the August 2020 report.

5/2020: Updated adjustments to content actioned, proactive content actioned, content appealed by users, and content restored by Facebook and Instagram

At the time of our last update in November 2019, we made a number of improvements to our systems and accounting. These improvements allowed us to estimate the largest impacts and adjust our metrics at that time. Following the November 2019 report, we further refined these improvements.

Because of this work, in the fifth edition of the Community Standards Enforcement Report for May 2020, we are adjusting previously shared data. Most categories for 2019 are only minimally impacted, and any adjustments to data amount to no more than a 3% change in content actioned. We will continue to update historical numbers as we reclassify previously removed content for different violations based on existing and changing protocols, and continue to improve our systems and accounting.

11/2019: Content actioned, proactive rate for spam on Facebook

At Facebook, different systems take action on different types of content to improve efficiency and reliability for the billions of actions happening every quarter. One of these systems, which acts mainly on content containing links, did not log our actions on certain content if no one tried to view that content within seven days of its creation, even though the content was removed from the platform.

While we know this undercounts the true amount of actioned content containing external links, mainly affecting our spam metrics for content containing malicious links, we are not currently able to retrospectively size this undercounting. As such, the numbers currently reflected in the Community Standards Enforcement Report represent a minimum estimate of both content actioned and proactive rate for the impacted period. Updates about this issue will be posted here when available.
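To illustrate the shape of this gap, the following is a minimal sketch, assuming a simplified data model; the names, fields, and the exact view-window check are hypothetical and do not reflect the actual systems.

```python
# Illustrative sketch only; names and the logging rule are assumptions,
# not the actual measurement systems.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

VIEW_WINDOW = timedelta(days=7)

@dataclass
class Content:
    content_id: str
    created_at: datetime
    first_view_at: Optional[datetime] = None  # None if never viewed

def log_removal_buggy(content: Content, measurement_log: List[str]) -> None:
    # Buggy rule: the removal is logged only if someone tried to view the
    # content within seven days of its creation. Removals of never-viewed
    # link content leave no record, so content-actioned counts run low.
    if (content.first_view_at is not None
            and content.first_view_at - content.created_at <= VIEW_WINDOW):
        measurement_log.append(content.content_id)

def log_removal_fixed(content: Content, measurement_log: List[str]) -> None:
    # Corrected rule: every removal is logged, whether viewed or not.
    measurement_log.append(content.content_id)
```

Because the dropped records were never written in the first place, the gap cannot be measured after the fact, which is why the reported figures are a floor rather than an exact count.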

11/2019: Content actioned, proactive content actioned, content appealed by users, and content restored by Facebook

When we shared the second edition of the Community Standards Enforcement Report in November 2018, we updated our method for counting how we take actions on content. We did this so that the metrics better reflected what happens on Facebook when we take action on content for violating our Community Standards. For example, if we find that a post containing one photo violates our policies, we want our metric to reflect that we took action on one piece of content – not two separate actions for removing the photo and the post.

However, in July 2019, we found that the systems logging and counting these actions did not correctly log the actions taken. This was largely due to the difficulty of counting multiple actions that take place within a few milliseconds without missing, or overstating, any of the individual actions taken. Because our logging system for measurement purposes is distinct from our operations to enforce our policies, the issue with our accounting did not impact how we enforced our policies or how we informed people about those actions; it only impacted how we counted the actions we took. As soon as we discovered this issue, we worked to fix it, identify any incorrect metrics previously shared, and establish a more robust set of checks in our processes to ensure the accuracy of our accounting. In total, we found that this issue impacted the numbers we had previously shared for content actioned, proactive rate, appealed content, and restored content for Q3 2018, Q4 2018, and Q1 2019.
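As a rough illustration of why this counting is delicate, the following is a minimal sketch of grouping sub-actions that land within milliseconds of each other (for example, removing a photo and the post containing it) into a single content-level action. The data shape and the grouping window are hypothetical and do not reflect the actual measurement pipeline.

```python
# Illustrative sketch only; the data shape and grouping window are
# hypothetical, not the actual measurement pipeline.
from collections import defaultdict
from datetime import datetime, timedelta

DEDUP_WINDOW = timedelta(milliseconds=50)  # assumed grouping window

def count_content_actions(raw_actions):
    """Count content-level actions from (content_id, timestamp) sub-actions,
    collapsing sub-actions that occur within DEDUP_WINDOW of each other."""
    by_content = defaultdict(list)
    for content_id, ts in raw_actions:
        by_content[content_id].append(ts)

    total = 0
    for timestamps in by_content.values():
        timestamps.sort()
        last_counted = None
        for ts in timestamps:
            # A sub-action far enough from the last counted action starts a
            # new content-level action; nearby sub-actions fold into it.
            if last_counted is None or ts - last_counted > DEDUP_WINDOW:
                total += 1
                last_counted = ts
    return total

# Example: a photo removal and its post removal 5 ms apart count once;
# a separate action on the same content a day later counts again.
t0 = datetime(2019, 7, 1, 12, 0, 0)
actions = [("post123", t0),
           ("post123", t0 + timedelta(milliseconds=5)),
           ("post123", t0 + timedelta(days=1))]
assert count_content_actions(actions) == 2
```

Getting the grouping logic wrong in either direction would miss or overstate individual actions, which is the kind of error the corrected accounting addressed.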

The fourth edition of the Community Standards Enforcement Report includes the corrected metrics for the affected quarters, and the table linked above provides the previously reported metrics and their corrections.