Mozilla says YouTube must open up recommendation algorithm to audits

Gelo Gonzales

'YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,' says Brandi Geurkink, Mozilla’s senior manager of advocacy

Mozilla on Wednesday, July 7, published a report detailing how YouTube’s recommendation algorithm remains at the core of the platform’s “rabbit hole” phenomenon, which can send users spiraling into conspiracy theories and harmful, false information. It also showed that the algorithm routinely suggests videos that potentially violate YouTube’s own content policies.

The findings came via RegretsReporter, a crowdsourced Firefox and Chrome extension that let users flag videos they felt violated their standards or should not be available online. A total of 37,380 users made use of the extension and volunteered their data for analysis by Mozilla and experts in online harm, freedom of expression, and tech policy. Mozilla said the investigation took about 10 months.

RegretsReporter stemmed from an earlier Mozilla campaign that asked people what they regretted seeing on YouTube.

Mozilla also said that this is the largest-ever crowdsourced investigation into YouTube’s algorithm, which has been kept a closely guarded secret despite a general clamor for transparency from users and experts.

What Mozilla found is that the algorithm is a huge part of the problem. Here are some key figures:

  • Videos recommended by YouTube were 40% more likely to be flagged on RegretsReporter than videos volunteers had searched for
  • About 9% of the recommended videos that were flagged have since been taken down by YouTube, but only after racking up a combined 160 million views – meaning these violative videos gain significant traction before removal
  • In 43.6% of cases where Mozilla had data about what a volunteer watched before flagging a recommended video, the recommendation was completely unrelated to the previous video – meaning a user could still be exposed to such videos even when purposely trying to avoid them
  • Videos flagged by users tend to perform “extremely well” on the platform, acquiring 70% more views per day than other videos watched by volunteers. This could be because videos spreading false propaganda and conspiracy theories are sensational, and the recommendation algorithm rewards videos that get views, allowing them to snowball to millions of views.
  • 71% of all videos that volunteers reported as regrettable were actively recommended by YouTube’s very own algorithm
  • Flagged videos ranged from COVID fear-mongering to political misinformation to wildly inappropriate “children’s” cartoons
  • The rate of flagged videos is 60% higher in countries where English is not a primary language

To combat these problems, Mozilla suggested that platforms should be open to algorithm audits. “Require YouTube and other platforms to release information and create tools that enable researchers to scrutinize their recommendation algorithms through audits and data access provisions,” Mozilla said.

“Policymakers must recognize that YouTube and other platforms are not stepping up to provide this much-needed transparency voluntarily, and that regulatory interventions are necessary,” it added.

Its other recommendations include:

  • Platforms should publish frequent and thorough transparency reports that include information about their recommendation algorithms
  • Platforms should provide people with the option to opt-out of personalized recommendations
  • Platforms should create risk management systems devoted to recommendation AI

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” said Brandi Geurkink, Mozilla’s senior manager of advocacy. “Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm.”

“Mozilla hopes that these findings – which are just the tip of the iceberg – will convince the public and lawmakers of the urgent need for better transparency into YouTube’s AI.”

The full report can be found here. – Rappler.com

Gelo Gonzales

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.