Video sharing and social media platform YouTube must be more transparent and clear about its policies on disinformation and misinformation, fact-checkers at a global conference said on Friday, October 22.
At a panel titled “What does the fact-checking community want from YouTube?,” Mozilla Senior Manager of Advocacy Brandi Geurkink shared three findings from research Mozilla conducted on how YouTube’s recommendation algorithm surfaces credible or harmful content.
Mozilla created a browser extension to crowdsource examples of videos that YouTube recommended and people later regretted watching. It recruited more than 37,000 people across 191 countries to participate in the research.
These were the findings Geurkink outlined at the panel:
- Most of the videos people regret watching came from recommendations, which suggests that the algorithm is a big source of concern for YouTube’s user base.
- The algorithm recommends videos that violate YouTube’s own policies. For example, it recommended videos that were later removed from the platform for violating its policies on medical misinformation. In these cases, however, YouTube did not always provide a reason why a video was taken down.
- Non-English-speaking countries suffer the most from YouTube’s lack of transparency. There was a higher “rate of regret” among countries where English is not the first language.
Fact-checkers at the panel also pointed out that it isn’t always immediately clear to the public that the platform’s policy rollouts apply only to certain regions, such as the US or other English-speaking countries. Moreover, it’s hard for non-native English speakers to fact-check videos because there are few automated transcription tools for other languages.
The panel was held at Global Fact 8, the only annual conference dedicated to fact-checking worldwide. The conference opened on October 20 and will run until October 23.
Full Fact’s Phoebe Arnold, who moderated the panel, said that YouTube does little to engage with fact-checkers and has escaped criticism because US media scrutiny has focused on Facebook. YouTube was invited to this #GlobalFact8 panel, but no one from the company could join.
There’s a lot of room for cooperation between YouTube and fact-checking organizations, the panel said, when it comes to transparency. “YouTube is one of the least transparent social media platforms, but they don’t really have that image in comparison to Facebook,” said Mozilla’s Brandi Geurkink.
There is also room for cooperation among fact-checkers, said Arnold, Correctiv’s David Schraven, and Rappler’s Gemma Mendoza, who all agreed that fact-checkers should work together to demand that YouTube do better.
“It should be happening, we should be collaborating, we should be…joining voices together and saying it louder, so people will hear, people in these platforms will listen, and those in authority will then also listen, because things are happening. Because this is not just a question of entertainment. This is not just a question of content. This is a question of societies and democracies that are being affected,” said Mendoza.
YouTube in August removed over 1 million videos containing COVID-19 disinformation and then blocked all anti-vaccine content on its platform in September. These policies, however, provide no context or explanation for the removal of the false information and might lead to more confusion or misinformation.
Schraven instead recommended that fact-check videos be played before a video containing inaccurate information, the same way YouTube plays ads. Fact-checks in video format should be one of the changes YouTube implements within a year, said Schraven, and the company should compensate fact-checkers fairly if it asks them to produce videos that debunk false claims.
“They created the mess, they should pay for cleaning it up,” he said. – Rappler.com