JAN 19, 2022
We are continually assessing our metrics to learn how we can improve the ways we measure in our Community Standards enforcement report.
We also continue to review our policies and processes and the methodologies behind them. Changes to any of these inherently change the metric calculations themselves, so a shift in a metric may reflect these methodology or process updates in addition to genuine trends in how well we're mitigating violations.
As our measurement processes mature, we regularly review and validate our metrics. We have also established a set of standards that govern how we identify, correct and publicly report any adjustments to previously released data.
We identify potential issues with our data using a range of regular quality checks on our data sets, measurement tools and logging systems. When a potential issue has been identified, relevant teams at Meta will undergo a series of steps to investigate, mitigate and identify long-term fixes for the issue.
Once the issue has been addressed, Meta will update data in the Community Standards enforcement report. Where such corrections are meaningful, Meta will describe the issue, metrics affected and the time periods affected.

Corrections and adjustments
We are committed to transparently sharing our metrics as well as the processes that we use to calculate and improve them. To streamline and better govern the release of adjustments and corrections to our methodologies and metrics, we developed an information quality procedure to identify, rectify and publicly report any adjustments that we make to previously released information. This is a common practice among large statistical agencies and in federal agency public reports, and was developed in line with data reporting best practices in both the public and private sectors. These reviews and procedures will be critical to maintaining the accuracy and integrity of our reporting going forwards.
We constantly evaluate and validate our metrics and make sure that the information we are sharing is accurate and our methodologies to generate this data are sound. As part of this work, when we update our methodologies or adjust metrics, we'll share those changes here.
We're constantly refining our processes and methodologies in order to provide the most meaningful and accurate numbers on how we're enforcing our policies. Over the summer of 2019, we implemented information quality processes that create further checks and balances in order to make sure that we share valid and consistent metrics.
We identify different dimensions of each metric and develop a risk-prioritisation of segments that may significantly affect the metrics. For the segments in this prioritised list, we implement multiple checks to make sure that these segments are capturing information accurately.
For example, we break down our content-actioned metrics into multiple dimensions for review: whether our automated systems or human reviewers took the action, what led us to take action and what type of content (photos, text, video) we took action on.
With these different dimensions, we then assess how much bias would be introduced into our measurement if that dimension was not correctly represented in the metric (for example, if we didn't include video content in our metrics). These assessments allow us to identify dimensions that might affect the metric (such as whether humans took action).
Then, we work out how much the metric could be affected if that dimension was wrong (e.g. if we didn't log any of the content humans took action on). We then prioritise the biggest risk scenarios to do additional cross-checks. For these high-risk combinations, we develop additional tracking and cross-check systems to ensure that these metrics are estimated correctly.
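The prioritisation described above can be sketched as a simple scoring exercise. The segment names, shares and error probabilities below are invented for illustration and are not Meta's real data; the point is only that a segment's risk grows with both its share of the metric and the chance that its logging is wrong.

```python
# Hypothetical sketch of risk-prioritising metric dimensions. All segment
# names, shares and error probabilities are made-up illustrative values.

def risk_score(share_of_metric, chance_of_logging_error):
    """How much the metric could be off if this segment were mis-measured,
    weighted by how likely that mis-measurement is."""
    return share_of_metric * chance_of_logging_error

# Each segment of a content-actioned metric: (share of total, rough
# probability that its logging pipeline drops or mislabels records).
segments = {
    ("automation", "video"): (0.30, 0.05),
    ("human review", "video"): (0.10, 0.20),
    ("automation", "photo"): (0.45, 0.02),
    ("human review", "text"): (0.15, 0.10),
}

# Rank segments by risk; top entries get additional cross-check systems.
ranked = sorted(segments.items(), key=lambda kv: risk_score(*kv[1]), reverse=True)
for segment, (share, err) in ranked:
    print(segment, round(risk_score(share, err), 3))
```

Under these toy numbers, human-reviewed video ranks highest even though it is a small share of the metric, because its assumed logging-error probability is large.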
We have also implemented consistency checks to add more validation for our metrics. These include the following:
We periodically measure our actions with a separate, independent system that measures content actions. On a regular basis, we compare the outputs of these independent systems; these comparisons are intended to identify large errors in our accounting.
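A comparison like this might look like the following sketch. The function name and the 5% tolerance are assumptions for illustration only; the source doesn't specify how large a discrepancy counts as a "large error".

```python
# Minimal sketch of comparing the primary count of content actions against a
# separate, independent measurement system. The 5% tolerance is an assumed
# illustrative threshold, not a documented value.

def check_counts(primary_count, independent_count, tolerance=0.05):
    """Return True if the two systems agree within the tolerance
    (relative to the independent count); False flags an investigation."""
    if independent_count == 0:
        return primary_count == 0
    relative_gap = abs(primary_count - independent_count) / independent_count
    return relative_gap <= tolerance

print(check_counts(1_020_000, 1_000_000))  # 2% gap: within tolerance
print(check_counts(1_300_000, 1_000_000))  # 30% gap: flag for investigation
```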
We conduct a range of random spot checks to verify the accuracy of our measurement systems in near real time. This includes checking various outcomes that happen later in our system to double-check upstream outcomes. For example, we confirm that content that is appealed is also logged as content that has been actioned, as content must be actioned in order to be appealed. Many of these checks are intended to identify large errors such as content that is appealed but was never removed.
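The appeal example above amounts to a set-containment check: every appealed item must also appear in the actioned log. The sketch below uses made-up content IDs and in-memory sets; a real pipeline would stream logs rather than hold them in memory.

```python
# Illustrative spot check: every appealed content ID must also appear in the
# actioned-content log, since content must be actioned before it can be
# appealed. IDs are invented for the example.

actioned_log = {"c1", "c2", "c3", "c4"}
appealed_log = {"c2", "c5"}  # "c5" was appealed but never logged as actioned

# Any appealed-but-not-actioned ID indicates an upstream logging error.
inconsistent = appealed_log - actioned_log
print(sorted(inconsistent))  # -> ['c5']
```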
As with all aspects of our standards enforcement reporting, we will continue to evolve and improve our validity and consistency review processes over time.
We also established procedures to identify and correct information previously shared in our enforcement report, which we will regularly review and update. When we identify potential issues in metrics shared in the Community Standards enforcement report, we follow these steps:
Reporting. If a potential issue is discovered, our teams immediately file an incident report that alerts the relevant teams to begin investigating the issue.
Investigating and mitigating. The relevant teams review the potential issue, making immediate changes to prevent further consistency issues where necessary and developing solutions to avoid the issue in the future.
Sizing the issue. The relevant teams quantify the scope of the issue, determining which metrics and time periods were affected and by how much.
Post-mortem incident review. Once the issue has been mitigated, we will conduct a detailed internal review to identify the root causes and full impact of the issue. This allows us to identify broader risks to the validity of our measurement so that we can prevent or minimise them.
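The four steps above form an ordered workflow, which can be modelled as a toy state machine. Nothing here reflects Meta's actual internal tooling; the sketch only encodes the ordering of the stages.

```python
# A toy model of the incident-handling stages described above. Stage names
# and the Incident class are assumptions for illustration only.

from enum import Enum

class Stage(Enum):
    REPORTED = 1        # incident report filed
    INVESTIGATING = 2   # teams review and mitigate
    SIZING = 3          # scope and impact quantified
    POST_MORTEM = 4     # root-cause review after mitigation

class Incident:
    def __init__(self, description):
        self.description = description
        self.stage = Stage.REPORTED

    def advance(self):
        """Move to the next stage in order, never skipping one."""
        if self.stage is not Stage.POST_MORTEM:
            self.stage = Stage(self.stage.value + 1)

incident = Incident("content-actioned metric undercounted video removals")
incident.advance()  # REPORTED -> INVESTIGATING
incident.advance()  # INVESTIGATING -> SIZING
print(incident.stage.name)  # -> SIZING
```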
Once we've identified an issue and adjusted the affected metric, we will publicly report the correction by updating this post at the time of the subsequent release of the Community Standards enforcement report. In such an update, we will describe the issue, the metrics affected and the time periods affected. Where feasible, the data for previously affected quarters in the Community Standards enforcement report itself will include the adjusted metrics, to ensure that comparisons over time are meaningful.
In addition to the work we do internally to evaluate and improve our metrics, we also look for external input on our methodologies and expand the metrics that we report on to give a more robust picture of how we're doing at enforcing our policies.
To ensure that our methods are transparent and based on sound principles, we seek out analysis and input from subject matter experts on areas such as whether the metrics we provide are informative.
In order to ensure that our approach to measuring content enforcement was meaningful and accurate, we worked with the Data Transparency Advisory Group (DTAG), an external group of international academic experts in measurement, statistics, criminology and governance. In May 2019, they provided their independent, public assessment of whether the metrics we share in the Community Standards enforcement report provide accurate and meaningful measures of how we enforce our policies, as well as the challenges we face in this work, and what we do to address them. Overall, they found our metrics to be reasonable ways of measuring violations and in line with best practices. They also provided a number of recommendations for how we can continue to be more transparent about our work, which we discussed in detail and continue to explore. In addition to this, Meta has committed to an independent audit of the metrics shared in this report in 2021.