The algorithm YouTube uses recommends videos that don’t follow the company’s guidelines (Image: Marvin Tolentino/Alamy)
YouTube’s algorithm recommends videos that violate the company’s own policies on inappropriate content, according to a crowdsourced study.
Not-for-profit company Mozilla asked users of its Firefox web browser to install a browser extension called RegretsReporter, which tracked the YouTube videos they watched, and asked them whether they regretted watching each video.
Between July 2020 and May 2021, 37,380 users flagged 3362 videos they viewed as regrettable – a fraction of 1 per cent of all those they watched. Reports of these videos were highest in Brazil, with about 22 videos out of every 10,000 viewed being logged as regrettable.
The Mozilla researchers then watched the reported videos and checked them against YouTube’s content guidelines. They found that 12.2 per cent of the reported videos either shouldn’t be on YouTube at all, or shouldn’t be recommended through its algorithm.
About a fifth of the reported videos would fall under what YouTube’s rules classify as misinformation, and a further 12 per cent spread covid-19 misinformation, say the researchers. Other issues flagged in the survey included violent or graphic content and hate speech.
“Some of our findings, if scaled up to the size of YouTube’s user base, would raise significant questions and be really concerning,” says Brandi Geurkink at Mozilla in Germany. “What we’ve found is the tip of the iceberg.”
Most of the contentious videos were delivered to users through YouTube’s algorithm, which recommends videos from channels that a user may not necessarily follow or hasn’t searched for. Seven in 10 of the regret reports were tied to recommended videos, and those recommended by YouTube were 40 per cent more likely to be regretted than videos users actively searched for, say the Mozilla researchers.
Non-English-language videos were reportedly 60 per cent more likely to be regretted, which the researchers believe may be because YouTube’s algorithms are trained primarily on English-language videos.
“This highlights the need to tailor moderation decisions on a per-country level, and make sure YouTube has expert moderators that know what is happening in each country,” says Savvas Zannettou at the Max Planck Institute for Informatics in Germany.
Geurkink said YouTube’s lack of transparency over its algorithm is “unacceptable”, especially after years of research have raised concerns about its impact on society.
A YouTube spokesperson said: “The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone.”
The company added that it had made changes to its recommendation system in the past year that reduced consumption of “borderline content” to less than 1 per cent of all videos.