Google-owned video sharing platform YouTube has taken steps to ensure that users do not get misinformed by content on its platform. The examples the company cited include ‘promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11’.
The original blog post from YouTube, published on January 25, said it will reduce recommendations of content that could misinform users in harmful ways.
The video sharing platform also said it will ‘reduce the spread of borderline content’ – content that comes close to, but doesn’t cross the line of, violating YouTube’s guidelines. YouTube also announced that it will ‘pull in recommendations from a wider set of topics’ to make sure users don’t get too many similar recommendations.
The shift will only affect recommendations of what videos to watch, not whether a video is available on YouTube. People can still access all videos that comply with the Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results.
On Saturday, February 9, Guillaume Chaslot, a former engineer at Google – YouTube’s parent company – hailed the move as a ‘great victory’.
‘It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable,’ said Chaslot.
The change is another step by YouTube, following the ‘hundreds of changes’ it made last year to improve the quality of recommendations for users on the platform.
At this time, the change will apply to a small subset of videos in the United States; as its systems become more accurate, YouTube will introduce the change in more countries.
The change relies on a combination of machine learning and real people. ‘We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations,’ the blog post said.