If YouTube’s algorithms radicalize people, it’s hard to tell from the data

We’ve all seen it happen: Watch a video on YouTube and your recommendations shift, as if Google’s algorithms have decided that the video’s subject is your life’s passion. Suddenly, nearly all the videos you’re recommended, and probably many of the ads, are on that topic.

Mostly, the results are comical. But there has been a steady stream of stories about how this process has radicalized people, sending them down an ever-deepening rabbit hole until all their viewing is dominated by fringe ideas and conspiracy theories.

A new study released Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can’t rule out the existence of online radicalization, it certainly suggests that radicalization is not the most common experience. Instead, it seems that fringe ideas are simply part of a larger, self-reinforcing community.

Big Data

Typically, the challenge in conducting a study like this is obtaining data on people’s video-watching habits without those people knowing they’re being watched and potentially changing their behavior as a result. The researchers worked around this issue by getting data from Nielsen, which simply tracks what people are watching. People allow Nielsen to follow their habits, and the firm anonymizes the resulting data. For this study, the researchers obtained data from more than 300,000 viewers who collectively watched more than 21 million YouTube videos between 2016 and the end of 2019.

Most of these videos had nothing to do with politics, so the authors turned to the literature to identify a large collection of channels that previous research had labeled according to their political leanings, ranging from the far left to the far right. To that list, the researchers added a category they called “anti-woke.” While not always overtly political, a growing collection of channels focuses on “opposition to progressive social justice movements.” And while those channels tend to align with right-wing interests, the videos’ hosts often don’t present the views that way.

All told, the channels the researchers classified (just under 1,000 of them) accounted for only 3.3 percent of total video views during this period. And those who watched them tended to stick with a single type of material: if you started watching left-leaning content in 2016, you were probably still watching it when the study period ended. In fact, based on the amount of time spent per video, you were likely watching even more of that content by then, perhaps as a product of the controversies of the Trump years.

(The exception was far-left material, which was viewed so rarely that it was impossible to pick out statistically significant trends in most cases.)

Nearly all of the classified content types also saw growth over this period, both in total viewership and in the time spent watching videos on these channels (the far left and far right being the exceptions). This finding suggests that at least some of the trends reflect the growing use of YouTube as a replacement for more traditional broadcast media.
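These headline figures are descriptive statistics computed over a classified viewing log. Below is a minimal sketch of that kind of calculation, assuming a toy table with hypothetical column names (viewer_id, channel_category, watch_minutes, year); it is not the researchers’ actual Nielsen pipeline.

```python
import pandas as pd

# Toy stand-in for an anonymized viewing log; the column names are
# assumptions for illustration, not the study's actual schema.
views = pd.DataFrame({
    "viewer_id":        [1, 1, 2, 2, 3, 3, 3],
    "channel_category": ["none", "left", "none", "anti-woke", "far right",
                         "far right", "none"],
    "watch_minutes":    [12, 8, 30, 15, 10, 22, 5],
    "year":             [2016, 2016, 2019, 2019, 2016, 2019, 2019],
})

# Share of all views that land on politically classified channels
# (the study reports roughly 3.3 percent across just under 1,000 channels).
political = views[views["channel_category"] != "none"]
share = len(political) / len(views)
print(f"Share of views on classified channels: {share:.1%}")

# Per-category trends: distinct viewers and minutes per video, by year --
# the two quantities the researchers track for each group.
trends = (political
          .groupby(["channel_category", "year"])
          .agg(viewers=("viewer_id", "nunique"),
               minutes_per_video=("watch_minutes", "mean")))
print(trends)
```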

Trends

Since viewers mostly stuck to a single type of content, it’s easiest to think of them as distinct groups. The researchers tracked the number of people who belonged to each group, as well as the amount of time they spent watching videos, over the four-year period.

Throughout that time, the mainstream left was almost as large as the other groups combined; the centrists came next. The mainstream right and the anti-woke viewers started the period at roughly the same level as the far right. But they all showed different trends. Total far-right viewership remained steady, but the amount of time those viewers spent watching videos increased. Conversely, mainstream-right viewership grew overall, but the amount of time spent watching wasn’t much different from that of the far right.

Anti-woke viewers showed the highest rate of growth of any group. By the end of the period, they were spending more time watching videos than the centrists, even though their numbers remained smaller.

Does any of this represent radicalization? The lack of significant growth at the two extremes suggests that there is no major trend in YouTube viewing pushing people toward the far left or the far right. In fact, the researchers found evidence that many people on the far right were using YouTube as just one part of an ecosystem of sites they engaged with. (Again, the far left was too small to analyze.) Far-right viewers were more likely than other viewers to reach far-right videos via links from right-wing websites.

There are also no signs of a ramp-up. If YouTube’s algorithms were steadily directing people toward more extreme videos, the frequency of fringe videos should rise toward the end of a viewing session. That didn’t happen; in fact, the opposite did.
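One way to test for that kind of ramp-up is to ask whether the chance of landing on a fringe video rises with position inside a session. Here is a rough sketch of that check on a toy log; the column names (session_id, position, channel_category) are assumptions, not the paper’s data.

```python
import pandas as pd

# Toy session log; columns and values are invented for illustration.
sessions = pd.DataFrame({
    "session_id":       [1, 1, 1, 2, 2, 2, 2],
    "position":         [1, 2, 3, 1, 2, 3, 4],   # order within each session
    "channel_category": ["none", "mainstream right", "none",
                         "far right", "none", "none", "none"],
})

sessions["is_far_right"] = sessions["channel_category"] == "far right"

# If recommendations were steadily escalating, the far-right share should
# climb with position; the study reports that it does not.
ramp = sessions.groupby("position")["is_far_right"].mean()
print(ramp)
```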

Sticky, But Not Radical

The researchers did note, however, that far-right content was somewhat stickier: viewers spent more time on each video, even though the far-right audience didn’t grow appreciably. Anti-woke content was stickier still, and it saw the largest growth in viewership. In addition, people who watched several anti-woke videos in one session were more likely to keep watching them in the future.
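The “keep watching” finding is essentially an association between earlier and later viewing of the same category. A hedged sketch of that comparison, again with invented column names and numbers:

```python
import pandas as pd

# Invented example: per-viewer counts of anti-woke videos in an early and a
# later window; this is illustrative, not the study's data or method.
viewers = pd.DataFrame({
    "viewer_id":       [1, 2, 3, 4, 5, 6],
    "anti_woke_early": [0, 0, 1, 3, 5, 0],
    "anti_woke_later": [0, 1, 2, 4, 6, 0],
})

# Compare later viewing for people who did vs. did not watch anti-woke
# content early on -- the kind of association described above.
viewers["watched_early"] = viewers["anti_woke_early"] > 0
print(viewers.groupby("watched_early")["anti_woke_later"].mean())
```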

While the anti-woke videos didn’t present themselves as overtly political, their audience skewed right-wing, judging by its integration with the larger ecosystem of right-wing websites. Even that didn’t promote radicalization, however: a growing anti-woke audience didn’t end up producing a larger far-right audience.

Although the researchers found no evidence that YouTube is driving radicalization, the work has some clear limitations. For one, it tracked only desktop browser usage, so it missed mobile viewing. The researchers also couldn’t determine what YouTube’s algorithms actually recommended, so they could only infer the likely response to recommendations from users’ overall behavior. And, as always, the average behavior of users can obscure some dramatic exceptions.

“On a platform with about 2 billion users, it is possible to find examples of almost any type of behavior,” as the researchers noted.

PNAS, 2021. DOI: 10.1073/pnas.2101967118 (About DOIs).
