YouTube’s Algorithm Recommends Unpleasant Videos: Mozilla


Posted on 09 July 2021



According to the 10-month crowdsourced investigation, which Mozilla says is the largest of its kind, people in non-English-speaking countries were 60 percent more likely to encounter disturbing videos. Researchers were given access to a pool of YouTube recommendation data through RegretsReporter, an open-source browser extension that lets users voluntarily donate their browsing data. The analysis found that YouTube's algorithm actively recommended 71 percent of the videos that project volunteers flagged as regrettable. YouTube has since taken down about 200 of these videos, which had accumulated a total of 160 million views before their removal.

Mozilla also discovered that recommended videos were 40 percent more likely to be regretted than videos volunteers searched for, and that in 43.6 percent of cases where Mozilla had data on what a volunteer watched before a regret, the recommendation was completely unrelated to those previous videos.


Key Points


  • People in non-English-speaking countries were 60 percent more likely to encounter disturbing videos, according to the 10-month crowdsourced investigation, which Mozilla says is the largest of its kind.

  • YouTube has taken down about 200 of these videos, which had accumulated a total of 160 million views before their removal.


