
What does YouTube do to prevent bias?

We work hard to ensure that our systems are not designed to be biased against content from individuals or groups based on political viewpoints or other attributes, such as gender or sexual orientation. Our platform has always been about sharing information everywhere and giving many different people a voice.

Preventing bias

How does YouTube help ensure that unintended harmful bias is not present in its systems?

We rely on human evaluators across the globe to help train our search and discovery systems, and the guidelines they use are publicly available. Our search and recommendation systems are not designed to filter or demote videos or channels based on specific political perspectives.

Additionally, we audit our machine learning systems to help ensure that unintended algorithmic bias, such as gender bias, isn't present. When we find mistakes, we correct them and retrain the systems to be more accurate in the future.
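
To make the auditing idea concrete, here is a minimal sketch of a group-disparity audit, the general technique behind checks of this kind: compare a classifier's error rates across groups and flag the model when the gap is too large. YouTube's actual tooling isn't public, so every name, the data layout, and the 2% threshold below are assumptions for illustration only.

```python
# Minimal sketch of a group-disparity audit; all names, the data layout,
# and the threshold are hypothetical, not YouTube's actual tooling.
from collections import defaultdict

def group_error_rates(examples, predict):
    """Per-group error rates for a classifier.

    `examples` yields (features, true_label, group) tuples;
    `predict` maps features to a predicted label.
    """
    errors, totals = defaultdict(int), defaultdict(int)
    for features, label, group in examples:
        totals[group] += 1
        if predict(features) != label:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

def audit(examples, predict, max_gap=0.02):
    """Flag the model if error rates across groups differ by more than max_gap."""
    rates = group_error_rates(examples, predict)
    gap = max(rates.values()) - min(rates.values())
    return {"rates": rates, "gap": gap, "flagged": gap > max_gap}
```

A flagged result would feed the correct-and-retrain loop described above: fix the offending data or model, retrain, and re-run the audit.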

Do YouTube's policies unfairly target certain groups or political viewpoints?

When developing and refreshing our policies, we make sure to hear from a range of different voices, including Creators, subject-area experts, free speech proponents, and policy organisations from across the political spectrum.

Once a policy has been developed, we invest significant time making sure that newly developed policies are consistently enforced by our global team of reviewers, based on objective guidelines. Before any policy is launched, reviewers in a staging environment (where policy decisions aren't actually applied) must consistently make the same decision at a very high rate. If they fail to do so, we revise the training and internal guidelines to ensure clarity and repeat the process. The goal is to reduce subjectivity and personal bias to achieve high accuracy and consistency when operating at scale. Only once we're at an acceptable level of accuracy can the policy be launched to the public.
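
To make the staging step concrete, here is a minimal sketch of a consistency gate of the kind described above. YouTube doesn't publish its agreement thresholds or review tooling, so the 90% threshold and all names here are assumptions for illustration.

```python
# Hypothetical launch gate based on reviewer agreement in staging;
# the 0.9 threshold and all names are illustrative assumptions.
from collections import Counter

def item_agreement(decisions):
    """Fraction of reviewers who chose the majority decision for one item."""
    counts = Counter(decisions)
    return counts.most_common(1)[0][1] / len(decisions)

def ready_to_launch(staging_results, threshold=0.9):
    """staging_results: one decision list per reviewed item,
    e.g. [["remove", "remove", "remove"], ["keep", "keep", "remove"], ...].
    """
    return all(item_agreement(d) >= threshold for d in staging_results)
```

With three reviewers per item, for example, a 2-to-1 split scores about 0.67 and would fail a 0.9 gate, sending the policy back for clearer training and guidelines before the next staging round.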