How we help prevent interference, empower people to vote and more.
How we work with independent fact-checkers, and more, to identify and take action on misinformation.
How we assess content for newsworthiness.
How we reduce problematic content in News Feed.
Quarterly report on how well we're doing at enforcing our policies on the Facebook app and Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional restrictions that limit people's ability to access the internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
For most violations, if you continue to post content that goes against the Facebook Community Standards or Instagram Community Guidelines despite repeated warnings and restrictions, we will disable your account.
After 5 strikes, you may receive additional 30-day restrictions from creating content, or we may remove your account, depending on the severity and frequency of the violations. In some cases, a violation may be severe enough that we’ll disable your account after one occurrence, as in the case of posting child sexual exploitation content.
We'll also disable some accounts as soon as we become aware of them, such as those belonging to dangerous individuals or convicted sex offenders, accounts created to get around our restrictions, and accounts whose owners misrepresent their identity.
If your Facebook or Instagram account has been disabled, you'll see a message saying your account is disabled when you try to log in. We also let you know whether you can request another review if you believe we made a mistake.