Published On: Thu, Jul 8th, 2021

Mozilla crowdsourced reports of bad YouTube recommendations


That YouTube's machine-learning-driven recommendation feed can frequently surface results of an edgy or even radicalizing bent isn't much of a question anymore. YouTube itself has pushed tools that it says could give users more control over their feed and more transparency about certain recommendations, but it's difficult for outsiders to know what kind of impact they're having. Now, after spending much of the last year collecting data from its RegretsReporter extension (available for Firefox or Chrome), the Mozilla Foundation has more information on what people see when the algorithm makes the wrong choice, and it has released a detailed report (PDF).

The extension launched in September 2020, taking a crowdsourced approach to finding "regrettable" content that people encounter via the recommendation engine. After receiving 3,362 reports (along with data from people who installed the extension but did not submit reports), Mozilla found trends that point to problems with YouTube's approach.
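For illustration, here is a minimal sketch of how a crowdsourced reporting extension along these lines might structure and submit a flagged recommendation. The schema, field names, and endpoint are all hypothetical assumptions for the sake of the example, not Mozilla's actual RegretsReporter implementation.

```typescript
// Hypothetical "regret" report payload, for illustration only.
// The schema, field names, and endpoint below are assumptions,
// not Mozilla's actual RegretsReporter implementation.

interface RegretReport {
  videoId: string;               // the video the volunteer is flagging
  recommendationTrail: string[]; // videos watched before the regret, when known
  reportedAt: string;            // ISO 8601 timestamp of the report
  comment?: string;              // optional free-text explanation
}

// Submit the report to a hypothetical collection endpoint.
async function submitReport(report: RegretReport): Promise<void> {
  await fetch("https://example.org/api/regrets", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
}

// Example: flag a video along with the trail that led the algorithm to it.
submitReport({
  videoId: "VIDEO_ID_PLACEHOLDER",
  recommendationTrail: ["PREVIOUS_VIDEO_1", "PREVIOUS_VIDEO_2"],
  reportedAt: new Date().toISOString(),
}).catch(console.error);
```

Capturing the trail of previously watched videos is what would let researchers ask questions like the one Mozilla raises later in the report: whether a recommendation bears any relation to what the viewer was watching before it.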

While the foundation says it deliberately kept the concept of a "regret" vague, it judged that 12.2 percent of reported videos violated YouTube's own content rules, and noted that about nine percent of them (nearly 200 in total) have since been removed from YouTube, but only after accruing over 160 million views. As for why those videos were recommended in the first place, one possible explanation is popularity: Mozilla noted that reported videos averaged 70 percent more views per day than other videos watched by volunteers.

Mozilla senior director of advocacy Brandi Geurkink says "YouTube needs to admit their algorithm is designed in a way that harms and misinforms people." Still, two stats in particular jumped out to me from the study: Mozilla says "in 43.3 percent of cases where we have data about trails a volunteer watched before a Regret, the recommendation was completely unrelated to the previous videos that the volunteer watched." Also, the rate of regrettable videos reported was 60 percent higher in countries where English is not a primary language. Despite the small sample size and the possibility of selection bias in the data, that suggests there's more worth examining in places where English speakers aren't even paying attention.

NBC News included a statement from YouTube regarding the report, which claimed that "over the past year alone, we've launched over 30 different changes to reduce recommendations of harmful content." The company gave a similar response when the project launched last year. The reforms Mozilla suggests include transparency reports and the ability to opt out of personalization, but with YouTube pulling in over $6 billion per quarter from advertising, a retreat from profiling seems doubtful.
