YouTube has once again cleaned up its platform. On Monday, the Google-owned company said it removed more than 8 million videos between October and December for violating its community guidelines. The majority of the videos were spam or attempts to upload “adult content.”

YouTube released a statement:

“The majority of these 8 million videos were mostly spam or people attempting to upload adult content – and represent a fraction of a percent of YouTube’s total views during this time period.”

The company says it introduced machine-learning software to flag inappropriate content on its platform last June, and since then the number of videos flagged by the software has grown dramatically. Removal has also become faster: more than half of the violent videos taken down these days garner fewer than 10 views before facing the axe.

How Videos Are Filtered:

While YouTube employs human reviewers, most of the filtering is automated: 80% of the removed videos, about 6.7 million, were flagged by machines, and 76% of those were taken down before a single view was registered.

Most of the deleted content came from spam networks uploading adult content in violation of the site’s terms of service.

Here’s how those reports broke down by content type:

  • Sexually explicit – 9 million (30%)
  • Spam or misleading – 8 million (27%)
  • Hateful or abusive – 4.6 million (16%)
  • Violent or repulsive – 4 million (14%)
  • Harmful or dangerous acts – 2.3 million (8%)
  • Child abuse – 1.6 million (6%)
  • Promotes terrorism – 491,000 (2%)

Is It Making a Difference?

It remains unclear whether these steps will be enough to combat the negative attention the platform has drawn in recent times. The site has also had issues with some of its biggest content creators failing to adhere to its guidelines, a situation YouTube has yet to properly address.