
Progress on managing harmful content

Our Community Guidelines outline what we allow and don't allow on YouTube. A key part of our commitment to responsibility is enforcing these guidelines and removing policy-violating content. We've put together a few key data points to show you how we're progressing with our responsibility efforts. If you'd like to know more, you can find further detail in our dedicated Community Guidelines Enforcement Report.


Videos removed, by removal reason

YouTube relies on teams around the world to review flagged videos and take one of three actions: remove content that violates our Community Guidelines; restrict videos (e.g. age-restrict content that may not be appropriate for all audiences); or leave the content live when it doesn't violate our guidelines.

This chart shows the volume of videos removed by YouTube, broken down by the reason for removal. These removal reasons correspond to YouTube's Community Guidelines. Reviewers evaluate flagged videos against all of our Community Guidelines and policies, regardless of why the video was originally flagged.

[Chart: videos removed, by removal reason – April 2021–June 2021]

Videos removed, by first source of detection

This chart shows the volume of videos removed by YouTube, by source of first detection (automated flagging or human detection). Flags from human detection can come from a user or a member of YouTube's Trusted Flagger programme. Trusted Flagger programme members include NGOs and government agencies that are particularly effective at notifying YouTube of content that violates our Community Guidelines.

[Chart: videos removed, by first source of detection – April 2021–June 2021]

Videos removed, by views

YouTube strives to prevent content that breaks our rules from being widely viewed – or viewed at all – before it's removed. Automated flagging enables us to act more quickly and accurately to enforce our policies. This chart shows the percentage of videos that were removed before they received any views versus those removed after receiving some views.

[Chart: videos removed, by views – April 2021–June 2021]