This page summarises the different types of signals we use to rank content in Feed. To go directly to our Content Distribution Guidelines, see the link at the end of this page.
Meta can connect you to who and what matters most: your people, your interests and your world together in one place. Our goal is to make sure that you see the content that is most valuable to you, whether you're browsing Feed, using Search to look for something specific, connecting with your community in groups, searching for a deal in Marketplace or watching videos in Watch. To do this, we arrange all of the content that you could see on each of these surfaces with the aim of showing you the things that we think you may be most personally interested in at the top of each surface.
This means that when you encounter content across Facebook, the way it appears is often personalised to you. Every friend you connect with, group you join, Page you like, comment you make or response you leave on Feed surveys such as, "Is this post worth your time?" tells us what is most meaningful to you, and our algorithms use all of this activity to predict what you're most likely to be interested in and determine what to show you. This personalisation makes you an active participant in the experience.
Like elsewhere across Facebook, the posts that you see in Feed are ranked based on what we believe will be most valuable to you. Feed largely comprises posts from the friends, groups and Pages that you've chosen to connect with. Because most people have more content in their Feed than they could possibly browse in one session, we use an algorithm to determine the order of all of the posts that you could see. To prioritise the most meaningful posts at the top of your Feed, the algorithm works in four steps:
The first item the algorithm considers is your inventory, or the total set of posts you could see when you open Facebook. This includes all of the posts shared by the people you have connected to as 'friends', the Pages that you follow and the groups that you have joined, interspersed with ads and recommended content that we think will be relevant to you based on your Facebook activity.
Then, for each of these posts, the algorithm considers multiple factors, such as who posted it, how you have previously interacted with that person, whether it's a photo, a video or a link, and how popular the post is based on things such as how many of your friends liked it and which Pages re-shared it. All of these factors are called signals.
From there, the algorithm uses these signals to make a series of personalised predictions about each post based on how likely it is to be relevant to you: for example, whether it's from your friends or family, how likely you might be to comment on it, how likely it is to foster a meaningful interaction, how likely you might be to find it on your own or whether it contains a quality indicator (if a piece of news is original content, the algorithm assigns it a higher personalised relevance score, and it will often appear closer to the top of your Feed). We also run a number of surveys asking people whether a post was "worth your time", and based on those survey responses, we predict how likely people are to find a post worthwhile. Posts that are predicted to be more worthwhile are shown higher up in Feed.
Lastly, the algorithm calculates a relevance score for each post in your inventory based on these signals and predictions. Posts with higher scores are more likely to be interesting to you, so they'll be placed closer to the top of your Feed, and posts with lower scores will be closer to the bottom.
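The four steps above can be illustrated with a toy sketch. Everything here (the signals chosen, the weights, the heuristic predictions) is our own illustrative assumption for explanation only; the real system uses learned machine-learning models over far more signals, not hand-set rules.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str            # signal: 'photo', 'video' or 'link'
    likes: int           # signal: popularity among your friends
    reshares: int        # signal: how many Pages re-shared it
    is_original: bool    # quality indicator: original content

# Hypothetical weights combining predictions into one score;
# in practice these are learned, not hand-tuned.
WEIGHTS = {"comment": 2.0, "meaningful": 3.0, "worthwhile": 4.0}

def predict(post: Post, affinity: dict) -> dict:
    """Step 3: turn signals into personalised predictions (toy heuristics).

    `affinity` maps an author to how closely you're connected to them (0..1).
    """
    closeness = affinity.get(post.author, 0.1)
    return {
        "comment": closeness * (0.3 if post.kind == "photo" else 0.2),
        "meaningful": closeness * min(post.likes / 100, 1.0),
        "worthwhile": 0.5 if post.is_original else 0.3,
    }

def relevance_score(post: Post, affinity: dict) -> float:
    """Step 4: combine the predictions into a single relevance score."""
    preds = predict(post, affinity)
    return sum(WEIGHTS[k] * v for k, v in preds.items())

def rank_feed(inventory: list, affinity: dict) -> list:
    """Steps 1-4: order the whole inventory by descending relevance."""
    return sorted(inventory, key=lambda p: relevance_score(p, affinity),
                  reverse=True)
```

In this sketch, a modestly popular photo from a close friend can outrank a far more popular link from a Page you barely interact with, because the predictions are personalised rather than purely popularity-based.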
This video visualises how this process generally works. For more about how we use ranking to predict what might be most valuable to you in Feed, see How does Feed predict what you want to see?
Conversely, there are some kinds of posts that people have told us they don't want to see or that are broadly understood to be harmful, so we strive to remove content from Facebook altogether when it poses a real risk of harm, such as graphic violence, hate speech or fake COVID cures. This is why Facebook has Community Standards that prohibit harmful content, and why we invest heavily in developing ways of identifying and acting on it quickly. We also use Feed ranking to reduce the distribution of posts that may contain content that people find objectionable, but that don't necessarily meet the bar for removal under our policies. If a post is likely to contain misinformation, a sensationalised health claim or clickbait, for example, it will receive a lower value score and appear lower in Feed as a result. We monitor for new types of problematic content such as these, and develop new algorithmic levers to detect and enforce against them. However, a platform used by an ever-growing population inevitably presents new challenges, so our teams work every day to adapt and to address any gaps that arise in a thoughtful and deliberate way.
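Reducing distribution can be pictured as applying a demotion multiplier to a post's relevance score for each likely problem type. The problem labels, multipliers and confidence handling below are illustrative assumptions of ours, not Meta's actual values or mechanism.

```python
# Hypothetical demotion strengths per problem type (1.0 = no demotion);
# real strengths vary by severity and are not public.
DEMOTIONS = {
    "misinformation": 0.5,
    "sensational_health": 0.6,
    "clickbait": 0.7,
}

def demoted_score(base_score: float, flags: set, confidence: float) -> float:
    """Lower a post's relevance score for each problem type it likely contains.

    `confidence` is the classifier's certainty (0..1); an uncertain
    prediction demotes less than a confident one, mirroring the idea that
    enforcement scales with the system's confidence.
    """
    score = base_score
    for flag in flags:
        factor = DEMOTIONS.get(flag, 1.0)
        # Interpolate between no demotion (x1.0) and full demotion (x factor).
        score *= 1.0 - confidence * (1.0 - factor)
    return score
```

A demoted post still ranks somewhere in Feed, just lower; only content that violates the Community Standards is removed outright.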
For more information about how we use ranking to reduce the distribution of problematic content, see Remove, reduce, inform: New steps to manage problematic content
Our Content Distribution Guidelines outline some of the most significant reasons why content receives reduced distribution in Feed. As these guidelines develop, we will continue to provide transparency about how we define and treat problematic or low-quality content.
Our efforts to reduce problematic content in News Feed are rooted in our commitment to the values of responding to people's direct feedback, incentivising publishers to invest in high-quality content and fostering a safer community. We want people to be able to enjoy and share content about the issues that matter to them without being disrupted by problematic or low-quality content.
The reduced distribution may vary depending on the severity of the content, the number of times the poster or commenter has violated our rules previously, and the degree of confidence of our artificial intelligence systems' predictions, among other things. In certain cases, in addition to reducing the distribution of content, there may be additional consequences, such as a Page losing access to certain features (e.g. the ability to run ads).
While the majority of our reduced distribution efforts are applied around the world equally, we also recognise that in certain situations we cannot always take a one-size-fits-all approach to enforcement. See the "Why we use personalised ranking" section above for more information. We may also temporarily adjust our enforcement efforts in a specific region or during a critical event. For instance, we might adjust our enforcements in countries in conflict in order to keep people safe when offline harm may be involved, or during election periods, in periods of social unrest or as we did to help combat the COVID-19 pandemic.
See our Content Distribution Guidelines here.
Beyond helping you connect with people you already know, Facebook can help you discover interesting content from around the world. To do this, in Feed we'll occasionally suggest new videos, photos or articles from Pages and groups that you don't already follow, but we think you may be interested in.
While we know that these suggestions help you explore new content, we maintain stricter standards around what content we recommend because you have not chosen to follow these accounts on Facebook. Last year, we shared our Recommendations Guidelines to help ensure that we provide a safe and positive experience when suggesting new content to you on Facebook.
For more information about the ranking guardrails we use when recommending new content to people, see our Recommendations Guidelines.
We believe it's important that you feel in control of how you use Facebook – so while we do our best to rank content in a way that we think prioritises the items that will be most valuable to you, we've also built controls that help you customise your experience.
On Home Feed, you have the option to customise the way that content is ranked via Control Panel, or you can use Feed preferences to manage what you see there. We also offer a dedicated Feeds tab where you can choose to focus on the posts that matter most to you. Feeds is a destination that gives you more control by enabling you to filter content from the friends, Favourites, groups and Pages that you follow and are connected to. Additionally, posts on the Feeds tab are shown in reverse chronological order, so you can see the content from your friends, Favourites, groups and Pages in the order in which it was posted. You can also use ad preferences settings to adjust the ads that you see on Facebook and even turn off political ads.
We also provide a tool called "Why am I seeing this?", which accompanies posts in Feed. It gives greater context into how your interactions on Facebook influence what appears in your Feed, and enables you to take action to further personalise what you see by providing easy access to your Feed preferences.
For more about the controls we offer, see More control and context in Feed.