Quarterly report on how well we're doing at enforcing our policies on the Facebook app and Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional restrictions that limit people's ability to access the internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
If you post content that goes against the Facebook Community Standards or Instagram Community Guidelines, we’ll remove it and may then apply a strike to your Facebook or Instagram account. Whether we apply a strike depends on the severity of the content, the context in which it was shared and when it was posted.
If you posted this content to a Page or group you manage on Facebook, the strike may also count against that Page or group. If you manage a group, we may also count violations you approve as strikes against that group. In some cases, such as posting child sexual exploitation content, a violation may be severe enough that we’ll disable your account, Page or group on Facebook, or your account on Instagram, after a single occurrence.
To ensure our strike system is fair and proportionate, we won’t count strikes on violating content posted more than 90 days ago for most violations, or more than 4 years ago for more severe violations. We also won’t count strikes for certain policy violations. This includes when someone shares their own financial information, which we remove to prevent fraud, or cases where we have extra context about the nature of the violation.
If we remove multiple pieces of content at once, without notifying you of each removal, we may count them as a single strike. All strikes on Facebook or Instagram expire after one year.
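The age, batching and expiry rules described above can be sketched as simple date arithmetic. This is a hypothetical illustration only, not Meta's actual implementation: the function names are invented, and "one year" and "4 years" are approximated as 365-day multiples.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds drawn from the policy text above.
STANDARD_WINDOW = timedelta(days=90)       # most violations
SEVERE_WINDOW = timedelta(days=4 * 365)    # more severe violations (approx. 4 years)
STRIKE_LIFETIME = timedelta(days=365)      # all strikes expire after one year (approx.)

def counts_as_strike(posted_at: datetime, removed_at: datetime, severe: bool) -> bool:
    """A removal adds a strike only if the content was posted recently enough."""
    window = SEVERE_WINDOW if severe else STANDARD_WINDOW
    return removed_at - posted_at <= window

def strikes_for_batch_removal(num_items: int) -> int:
    """Multiple pieces removed at once, without per-item notice, count as one strike."""
    return 1 if num_items > 0 else 0

def active_strikes(strike_dates: list[datetime], now: datetime) -> int:
    """Count only strikes applied within the last year; older ones have expired."""
    return sum(1 for applied in strike_dates if now - applied <= STRIKE_LIFETIME)
```

For example, under this sketch a standard violation posted 120 days before removal would add no strike, while a severe violation of the same age would.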