YouTube removed more than 58 million videos and 224 million comments in its third quarter for violating its policies, according to Reuters.
According to reports, government officials and interest groups in the United States, Europe and Asia have been pressuring YouTube, Facebook and other social media services to quickly identify and remove extremist and hateful content that critics say can incite violence.
YouTube began issuing quarterly reports on its enforcement efforts this year. In past quarters, most of the removed content has been spam, according to YouTube.
According to YouTube, its automated detection tools help the video-sharing platform quickly identify spam, extremist content and nudity. In September, about 90% of the nearly 10,400 videos removed for violent extremism, and of the 279,600 videos removed for child safety issues, had received fewer than 10 views.
But for material that promotes hateful rhetoric and dangerous behaviour, YouTube still relies on users to report potentially problematic videos or comments, as the automated detection technologies for those policies are relatively new and less effective. This means such content may be widely viewed before it is removed.
YouTube also removed about 1.67 million channels, along with all 50.2 million videos they hosted. Google disables channels that accrue three policy violations in 90 days or commit what the company deems an egregious violation, such as uploading child pornography.
According to YouTube, nearly 80% of the removed channels were taken down for spam uploads, about 13% for nudity, and 4.5% for child safety issues. The company declined to disclose the overall number of accounts that have uploaded videos, but said the removals represented a small fraction of the total.
In addition, about 7.8 million individual videos were removed for policy violations, roughly in line with the previous quarter.