Changing YouTube's algorithm works. The site suggests less extremist content

Luc Williams

Researchers from the Swiss Federal Institute of Technology in Lausanne and the University of Pennsylvania used two types of bots to recreate the way people interact with the social media platform YouTube.

Recommendation algorithms have been criticized for recommending problematic content

Manoel Horta Ribeiro, one of the study’s co-authors, admitted in an interview with New Scientist that he had long been looking for a mechanism that could stop YouTube’s algorithm from continually recommending extremist content in the sidebar that remains constantly visible to the user.

Before the experiment began, the researchers trained two bots (each with its own YouTube account) by having them watch the same sequence of videos, so that the YouTube algorithm would assign them the same preferences. The control bot then followed the recommendations YouTube served it, the way a real user might, while the “alternative” bot ignored them entirely; instead, it was programmed to follow specific rules designed to separate user behavior from the influence of the algorithm. The study drew on data from 88,000 YouTube users. It turned out that relying solely on the recommendation system resulted in less radical content being shown.
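To make the design concrete, here is a minimal sketch of the two-bot logic. The slant scores, the toy recommender, and all names below are illustrative placeholders, not the researchers’ code and not YouTube’s actual system or API.

```python
import random

# Illustrative sketch of the two-bot design: both bots share a seed history,
# then only the control bot keeps following the recommendations it is served.
# Slant is a made-up score in [0, 1] (higher = more extreme); the recommender
# below is a toy stand-in, not YouTube's real recommendation system.

def get_recommendations(history):
    """Toy recommender: suggests videos with slants close to the recent average."""
    recent = history[-5:]
    mean = sum(recent) / len(recent)
    return [min(1.0, max(0.0, random.gauss(mean, 0.1))) for _ in range(10)]

def run_bot(seed_history, follow_recommendations, scripted_choices, steps=30):
    history = list(seed_history)
    for step in range(steps):
        recs = get_recommendations(history)
        if follow_recommendations:
            history.append(random.choice(recs))     # control bot: trust the sidebar
        else:
            history.append(scripted_choices[step])  # alternative bot: scripted rules
    return history

seed = [0.9] * 20        # both bots "trained" on the same far-right videos
scripted = [0.9] * 30    # the alternative bot keeps seeking such content itself
control = run_bot(seed, True, scripted)
alternative = run_bot(seed, False, scripted)
print(f"control bot, avg slant of watched videos:     {sum(control[20:]) / 30:.2f}")
print(f"alternative bot, avg slant of watched videos: {sum(alternative[20:]) / 30:.2f}")
```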

In the second experiment, scientists checked how long it takes for far-right videos to disappear from the list of recommended content. “Recommendation algorithms have been criticized for recommending problematic content to users long after they have lost interest in it,” Homa Hosseinmardi of the University of Pennsylvania’s Department of Computer and Information Science, first author of the PNAS paper, told Techxplore.

The algorithm change introduced in 2019 actually works

While the control bots continued to watch far-right content throughout the experiment, the alternative bots “switched” the user from one set of preferences (watching far-right videos) to another (watching moderately partisan videos).

As the alternative bots changed their content preferences, the researchers tracked the bias of the videos recommended in the sidebar and on the home page. “On average, after watching about 30 videos, the recommendations in the sidebar shifted towards a moderate profile,” Hosseinmardi said. He added that the recommendations on the home page adapted to the new profile more slowly. “The recommendations displayed on the home page were more in line with user preferences, while the recommendations in the sidebar were more related to the nature of the video currently being watched,” Hosseinmardi noted.
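One way to picture that measurement: after the preference switch, average the slant of the sidebar recommendations served alongside each watched video and note when that average crosses a “moderate” threshold. A small sketch with made-up numbers (the threshold and slant values are hypothetical, not taken from the paper):

```python
def videos_until_moderate(sidebar_slants_per_video, threshold=0.4):
    """sidebar_slants_per_video: one list of recommendation slants per video watched
    after the switch to moderate viewing. Returns how many videos it took for the
    average sidebar slant to fall below the (hypothetical) 'moderate' threshold."""
    for i, slants in enumerate(sidebar_slants_per_video, start=1):
        if sum(slants) / len(slants) < threshold:
            return i
    return None  # sidebar never reached a moderate profile in the observed window

# Toy data: sidebar slant drifting down from 0.9 (far right) by 0.02 per video watched
toy = [[0.9 - 0.02 * n] * 10 for n in range(40)]
print(videos_until_moderate(toy))  # prints 27 for this made-up trajectory
```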

The researchers hope that their tool can be used to study the interaction between user preferences and the artificial intelligence that underpins social media platforms. As New Scientist notes, the latest findings published in PNAS contrast with research conducted before 2019, which found that YouTube suggested extremist content more often than Gab, a social networking site popular with the alt-right. This suggests that the algorithm change introduced in 2019 is actually working.

The article convincingly challenges serious and well-documented allegations

YouTube has made over 30 separate changes to its recommendation system this year. Today, the company claims that the system learns from more than 80 billion different parameters, and its “violative view rate” (VVR) – a measure of what percentage of views across the platform come from videos that violate its own policies – is around 0.1 percent. YouTube says its VVR has dropped since it began tracking the metric in 2017, but has not said from what level.
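For context, the VVR is simply the share of all views that land on policy-violating videos; a back-of-the-envelope illustration with made-up numbers:

```python
# Made-up illustration of the violative view rate (VVR) described above:
# VVR = views of policy-violating videos / total views on the platform.
total_views = 1_000_000_000      # hypothetical total views in some period
violating_views = 1_000_000      # hypothetical views of violating videos
vvr = violating_views / total_views * 100
print(f"VVR = {vvr:.2f}%")       # prints "VVR = 0.10%"
```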

The paper, published in PNAS, “persuasively challenges serious and well-documented allegations made over the last six years that YouTube’s algorithms distribute or host extreme and far-right content,” Guy Healy of the Free University of Brussels (VUB) in Belgium told New Scientist. He also emphasized that continuous auditing of social media platforms and research into user interactions are very important.

It is also worth noting that people spend as much as 96 percent of their time on the Internet on anything other than consuming news. In December 2023, Prof. Magdalena Wojcieszak of the University of California said in an interview with PAP that people look up the result of a match, the weather forecast, or how to cook a specific dish more often than they look up socio-political information.

Author: Urszula Kaczorowska
