This page summarizes the different types of signals we use to rank content in Feed. To go directly to our Content Distribution Guidelines, click here.
Meta can connect you to who and what matters most: your people, your interests, and your world, together in one place. Our goal is to make sure you see the content that is most valuable to you, whether you’re browsing Feed, using Search to look for something specific, connecting with your community in Groups, searching for a deal in Marketplace, or watching videos in Watch. To do this, we arrange all of the content you could see on each of these surfaces so that the things we think you may be most personally interested in appear at the top.
This means that when you encounter content across Facebook, the way it appears is often personalized to you. Every friend you connect with, Group you join, Page you like, comment you make, or response you leave to Feed surveys such as “Is this post worth your time?” is an input that tells us what is most meaningful to you, and the algorithms we use determine what to show you by predicting what you’re most likely to be interested in based on all of this activity. This personalization makes you an active participant in the experience.
Like elsewhere across Facebook, the posts you see in Feed are ranked based on what we believe will be most valuable to you. Feed is made up largely of posts from the friends, Groups, and Pages you’ve chosen to connect with. Because most people have more content in their Feed than they could possibly browse in one session, we use an algorithm to determine the order of all of the posts you could see. To prioritize the most meaningful posts at the top of your Feed, the algorithm works in four steps:
The first item the algorithm considers is your inventory, or the total set of posts you could see when you open Facebook. This includes all the posts shared by the people you have connected to as ‘friends’, the Pages you follow and the Groups you have joined, interspersed with ads and recommended content we think will be relevant to you based on your Facebook activity.
Then, for each of these posts, the algorithm considers multiple factors, such as who posted it, how you have previously interacted with that person, whether it’s a photo, a video, or a link, and how popular the post is based on things like how many of your friends liked it or how many Pages re-shared it. All of these factors are called signals.
From there, the algorithm uses these signals to make a series of personalized predictions about how likely each post is to be relevant to you: for example, whether it’s from your friends or family, how likely you might be to comment on it, how likely it is to foster a meaningful interaction, how likely you might be to find it on your own, and whether it contains a quality indicator (if a piece of news is original reporting, for instance, the algorithm assigns it a higher personalized relevance score, so it will often show up closer to the top of your Feed). We also run a number of surveys asking people whether a post was “worth your time,” and based on those survey responses, we predict how likely people are to find a post worthwhile. Posts that are predicted to be more worthwhile are shown higher up in Feed.
Lastly, the algorithm calculates a relevance score for each post in your inventory based on these signals and predictions. Posts with higher scores are more likely to be interesting to you, so they’ll be placed closer to the top of your Feed, and posts with lower scores will be closer to the bottom.
This video visualizes how this process generally works. For more about how we use ranking to predict what might be most valuable to you in Feed, see How Does Feed Predict What You Want to See?
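To make the four steps above concrete, here is a minimal, hypothetical sketch in Python. Meta has not published its ranking code, so every name, field, weight, and formula below (Post, extract_signals, relevance_score, and so on) is invented purely to illustrate the inventory → signals → predictions → score flow, not to describe the real system.

```python
from dataclasses import dataclass

# Hypothetical, simplified stand-ins for the concepts described above.
# None of these fields, weights, or formulas come from Meta; they only
# illustrate the inventory -> signals -> predictions -> score flow.

@dataclass
class Post:
    author: str
    post_type: str           # "photo", "video", "link", ...
    friend_likes: int        # a popularity signal
    is_original_news: bool   # an example quality indicator

@dataclass
class User:
    affinity: dict           # how much you've interacted with each author
    candidate_posts: list    # step 1: your "inventory"

def extract_signals(user: User, post: Post) -> dict:
    """Step 2: who posted it, your history with them, format, popularity."""
    return {
        "affinity": user.affinity.get(post.author, 0.0),
        "is_video": post.post_type == "video",
        "friend_likes": post.friend_likes,
        "original_news": post.is_original_news,
    }

def make_predictions(signals: dict) -> dict:
    """Step 3: personalized predictions (the real learned models are omitted)."""
    base = min(1.0, 0.1 + 0.05 * signals["friend_likes"] + signals["affinity"])
    return {
        "p_comment": base,            # chance you comment on it
        "p_meaningful": 0.8 * base,   # chance of a meaningful interaction
        "p_worthwhile": 0.9 * base,   # "worth your time", per survey-based models
    }

def relevance_score(signals: dict, predictions: dict) -> float:
    """Step 4: combine predictions into a single relevance score."""
    score = sum(predictions.values())
    if signals["original_news"]:
        score *= 1.1                  # example boost for original reporting
    return score

def rank_feed(user: User) -> list:
    """Order the inventory so higher-scoring posts sit closer to the top."""
    scored = []
    for post in user.candidate_posts:
        signals = extract_signals(user, post)
        predictions = make_predictions(signals)
        scored.append((relevance_score(signals, predictions), post))
    return [post for _, post in sorted(scored, key=lambda s: s[0], reverse=True)]
```

In this toy version, a post from someone you interact with often, with more likes from friends and original reporting, ends up higher in the list; the real system relies on machine-learned models rather than hand-set weights like these.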
Conversely, there are some kinds of posts that people have told us they don’t want to see, or that are broadly understood to be harmful, so we strive to remove content from Facebook altogether when it poses a real risk of harm, like graphic violence, hate speech, or fake COVID-19 cures. This is why Facebook has Community Standards that prohibit harmful content, and why we invest heavily in developing ways of identifying and acting on it quickly. We also use Feed ranking to reduce the distribution of posts that may contain content people find objectionable but that doesn’t necessarily meet the bar for removal under our policies. If a post is likely to contain misinformation, a sensationalized health claim, or clickbait, for example, it will receive a lower value score and appear lower in Feed as a result. We monitor for new types of problematic content like these and develop new algorithmic levers to detect and enforce against them. We are also confronted by the challenges that come with a platform used by an ever-growing population, so our teams work hard every day to adapt and address any gaps that may arise in a thoughtful and deliberate way.
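As a purely hypothetical illustration of this kind of demotion, the sketch below lowers a post’s relevance score when a classifier flags it as likely misinformation, a sensationalized health claim, or clickbait. The labels, threshold, and multipliers are invented for this example and are not Meta’s actual values.

```python
# Hypothetical sketch only: the labels, threshold, and multipliers below are
# invented to illustrate score-based demotion, not taken from Meta's systems.

DEMOTION_MULTIPLIERS = {
    "likely_misinformation": 0.5,
    "sensational_health_claim": 0.7,
    "clickbait": 0.8,
}

def apply_demotions(score: float, classifier_probs: dict, threshold: float = 0.8) -> float:
    """Reduce a post's relevance score for each problem type a classifier flags.

    The post stays in Feed (it is not removed); it simply ranks lower."""
    for label, multiplier in DEMOTION_MULTIPLIERS.items():
        if classifier_probs.get(label, 0.0) >= threshold:
            score *= multiplier
    return score

# Example: a post flagged as probable clickbait drops from 10.0 to 8.0.
print(apply_demotions(10.0, {"clickbait": 0.93}))
```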
For more about how we use ranking to reduce the distribution of problematic content, see Remove, Reduce, Inform: New Steps to Manage Problematic Content
Our Content Distribution Guidelines outline some of the most significant reasons why content receives reduced distribution in Feed. As these guidelines develop, we will continue to provide transparency about how we define and treat problematic or low quality content.
Our efforts to reduce problematic content in Feed are rooted in our commitment to the values of Responding to People’s Direct Feedback, Incentivizing Publishers to Invest in High-Quality Content, and Fostering a Safer Community. We want people to be able to enjoy and share content about the issues that matter to them without being disrupted by problematic or low quality content.
The degree of reduced distribution may vary depending on the severity of the content, the number of times the poster or commenter has previously violated our rules, and how confident our artificial intelligence systems are in their predictions, among other things. In certain cases, reduced distribution may be accompanied by other consequences, such as a Page losing access to certain features (like the ability to run advertisements).
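The sketch below, again with invented numbers, shows how the strength of a demotion might scale with those factors, and how a repeat violator could additionally lose a feature such as the ability to run ads. It illustrates the idea, not Meta’s implementation.

```python
# Hypothetical sketch: all tiers and numbers are invented for illustration.

def demotion_multiplier(severity: int, prior_violations: int, confidence: float) -> float:
    """Return a multiplier in (0, 1]; smaller means a stronger demotion.

    severity: 1 (mild) to 3 (severe); prior_violations: past strikes;
    confidence: how sure the AI system is (0.0 to 1.0)."""
    base = {1: 0.9, 2: 0.7, 3: 0.5}.get(severity, 1.0)
    repeat_penalty = max(0.5, 1.0 - 0.1 * prior_violations)
    full_demotion = base * repeat_penalty
    # Low-confidence predictions are demoted less aggressively.
    return 1.0 - (1.0 - full_demotion) * confidence

def apply_feature_limits(page, prior_violations: int) -> None:
    """Beyond demotion, repeat violators may lose access to certain features."""
    if prior_violations >= 3:
        page.can_run_ads = False

# Example: severe content, two prior strikes, high classifier confidence.
print(demotion_multiplier(severity=3, prior_violations=2, confidence=0.95))
```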
While the majority of our reduced distribution efforts are applied equally around the world, we also recognize that in certain situations we cannot take a one-size-fits-all approach to enforcement. We apply personalized demotions to some types of content, such as content that is borderline to our policies, for people we predict do not want to see it or who tell us they do not want to see it. We may also temporarily adjust our enforcement in a specific region or during a critical event. For instance, we might adjust our enforcement to keep people safe when offline harm may be involved, during election periods, in periods of social unrest, or as we did to help combat the COVID-19 pandemic.
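Here is a hypothetical sketch of how such a personalized, situational demotion could be layered on top: a borderline post is demoted more strongly for a person who has said, or is predicted, not to want it, and a temporary regional adjustment can tighten enforcement during a critical event. All names and values are invented.

```python
# Hypothetical sketch: names and values are invented for illustration.

def personalized_multiplier(wants_borderline: float, opted_out: bool,
                            is_borderline: bool, region_adjustment: float = 1.0) -> float:
    """Combine a per-person demotion with a temporary regional adjustment.

    wants_borderline: predicted interest in borderline content (0.0 to 1.0)
    opted_out: the person told us they don't want to see such content
    region_adjustment: e.g. < 1.0 during an election or period of unrest"""
    multiplier = 1.0
    if is_borderline and (opted_out or wants_borderline < 0.2):
        multiplier *= 0.5          # stronger demotion for this person
    return multiplier * region_adjustment

# Example: borderline post, person opted out, tightened regional enforcement.
print(personalized_multiplier(0.6, True, True, region_adjustment=0.8))  # 0.4
```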
See our Content Distribution Guidelines here.
Beyond helping you connect with people you already know, Facebook can help you discover interesting content from around the world. To do this, in Feed we’ll occasionally suggest new videos, photos, or articles from Pages and Groups that you don’t already follow but that we think you may be interested in.
While we know these suggestions help you explore new content, we maintain stricter standards around what content we recommend because you have not chosen to follow these accounts on Facebook. Last year, we shared our Recommendations Guidelines to help ensure we provide a safe and positive experience when suggesting new content to you on Facebook.
For more about the ranking guardrails we use when recommending new content to people, see our Recommendations Guidelines.
We believe it’s important that you feel in control of how you use Facebook, so while we do our best to rank content in a way that prioritizes the items we think will be most valuable to you, we’ve also built controls that help you customize your experience.
On Home Feed, you have the option to customize the way content is ranked via the Control Panel, or you can use Feed preferences to manage what you see there. We also offer a dedicated Feeds tab where you can choose to focus on the posts that matter most to you. Feeds is a new destination that gives you more control by letting you filter content from the friends, Favorites, Groups, and Pages you follow and are connected to. Posts on the Feeds tab are also shown in reverse chronological order, so you see content from your friends, Favorites, Groups, and Pages in the order it was posted. You can also use Ad Preferences settings to adjust the ads you see on Facebook and even turn off political ads.
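As a small illustration of the difference in ordering, the snippet below shows the reverse-chronological sort the Feeds tab uses, in contrast to the relevance-ranked Home Feed; the field name is invented for this sketch.

```python
# Illustration only: the Feeds tab shows posts newest-first rather than by a
# relevance score. "created_at" is an invented field name for this sketch.

def feeds_tab_order(posts):
    """Reverse chronological: the most recently posted content comes first."""
    return sorted(posts, key=lambda post: post.created_at, reverse=True)
```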
We also provide a tool called Why Am I Seeing This?, which accompanies posts in Feed. Why Am I Seeing This? gives greater context into how your interactions on Facebook influence what shows up in your Feed and lets you take action to further personalize what you see by providing easy access to your Feed Preferences.
For more about the controls we offer, see More Control and Context in Feed.