YouTube is the biggest social media platform in the United States, and perhaps the most misunderstood. Over the past few years, the Google-owned platform has become a media powerhouse where political discussion is dominated by right-wing channels offering an ideological alternative to established news outlets. And, according to new research from Penn State University, these channels are far from fringe: they're the new mainstream, and recently surpassed the big three US cable news networks in terms of viewership.
The paper, written by Penn State political scientists Kevin Munger and Joseph Phillips, tracks the explosive growth of alternative political content on YouTube and calls into question many of the field's established narratives. It challenges the popular school of thought that YouTube's recommendation algorithm is the central factor responsible for radicalizing users and pushing them into a far-right rabbit hole.
The authors say that thesis largely grew out of media reports and hasn't been rigorously analyzed. The best prior studies, they say, haven't been able to prove that YouTube's algorithm has any noticeable effect. "We think this theory is incomplete, and potentially misleading," Munger and Phillips argue in the paper. "And we think that it has rapidly gained a place in the center of the study of media and politics on YouTube because it implies an obvious policy solution, one which is flattering to the journalists and academics studying the phenomenon."
Instead, the paper suggests that radicalization on YouTube stems from the same factors that persuade people to change their minds in real life, injecting new information, but at scale. The authors say the quantity and popularity of alternative (mostly right-wing) political media on YouTube is driven by both supply and demand. The supply has grown because YouTube appeals to right-wing content creators, with its low barrier to entry, easy way to make money, and reliance on video, which is easier to create and more impactful than text.
"This is attractive for a lone, fringe political commentator, who can produce enough video content to establish themselves as a major source of media for a fanbase of any size, without needing to acquire power or legitimacy by working their way up a corporate media ladder," the paper says.
According to the authors, that increased supply of right-wing videos tapped a latent demand. "We believe that the novel and disturbing fact of people consuming white nationalist video media was not caused by the supply of this media 'radicalizing' an otherwise moderate audience," they write. "Rather, the audience already existed, but they were constrained" by limited supply.
Other researchers in the field agree, including those whose work has been cited by the press as evidence of the power of YouTube's recommendation system. Manoel Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne and one of the authors of what the Penn State researchers describe as "the most rigorous and comprehensive analysis of YouTube radicalization to date," says that his work was misinterpreted to fit the algorithmic radicalization narrative by so many outlets that he lost count.
For his study, published in July, Ribeiro and his coauthors examined more than 330,000 YouTube videos from 360 channels, mostly associated with far-right ideology. They broke the channels into four groups, based on their degree of radicalization. They found that a YouTube viewer who watches a video from the second-most-extreme group and follows the algorithm's recommendations has only a 1-in-1,700 chance of arriving at a video from the most extreme group. For a viewer who starts with a video from the mainstream media, the chance of being shown a video from the most extreme group is roughly 1 in 100,000.
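The analysis boils down to following chains of recommendations and counting how often they land on the most extreme content. The snippet below is a minimal sketch of that random-walk idea, using an invented toy recommendation graph and invented group labels; it is not the study's code or data.

```python
import random

# Toy sketch of the random-walk idea: start at a video in one group, follow
# recommendations for a few hops, and count how often the walk reaches a more
# extreme group. The graph and labels below are invented for illustration.
RECOMMENDATIONS = {            # video_id -> recommended video_ids (toy data)
    "a1": ["a2", "b1"], "a2": ["a1", "b2"],
    "b1": ["b2", "c1"], "b2": ["a1", "b1"],
    "c1": ["c2", "b1"], "c2": ["c1", "b2"],
}
GROUP = {"a1": "mainstream", "a2": "mainstream",
         "b1": "second_most_extreme", "b2": "second_most_extreme",
         "c1": "most_extreme", "c2": "most_extreme"}

def reach_probability(start, target_group, hops=5, walks=100_000):
    """Fraction of random walks from `start` that reach `target_group`."""
    hits = 0
    for _ in range(walks):
        video = start
        for _ in range(hops):
            video = random.choice(RECOMMENDATIONS[video])
            if GROUP[video] == target_group:
                hits += 1
                break
    return hits / walks

print(reach_probability("b1", "most_extreme"))  # from the toy "second-most-extreme" group
print(reach_probability("a1", "most_extreme"))  # from the toy "mainstream" group
```

With a real recommendation graph, the same counting logic produces transition estimates like the 1-in-1,700 and 1-in-100,000 figures above.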
Munger and Phillips cite Ribeiro's paper in their own, published earlier this month. They looked at 50 YouTube channels that researcher Rebecca Lewis identified in a 2018 paper as the "Alternative Influence Network." Munger and Phillips reviewed the metadata for close to a million YouTube videos posted by those channels and mainstream news organizations between January 2008 and October 2018. The researchers also analyzed trends in search rankings for the videos, using YouTube's API to obtain snapshots of how they were recommended to viewers at different points over the last decade.
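For readers curious what that kind of data collection looks like in practice, here is a rough sketch of pulling per-video metadata from the YouTube Data API v3's videos endpoint. It is not the researchers' pipeline; the API key and video IDs are placeholders.

```python
import requests

# Rough sketch: fetch per-video metadata ("snippet" and "statistics" parts)
# from the YouTube Data API v3. API_KEY and VIDEO_IDS are placeholders.
API_KEY = "YOUR_API_KEY"
VIDEO_IDS = ["dQw4w9WgXcQ"]  # placeholder video IDs

def fetch_video_metadata(video_ids):
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "snippet,statistics",
                "id": ",".join(video_ids),
                "key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

for item in fetch_video_metadata(VIDEO_IDS):
    snippet, stats = item["snippet"], item.get("statistics", {})
    print(snippet["publishedAt"], snippet["title"], stats.get("viewCount"))
```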
Munger and Phillips divided Lewis's Alternative Influence Network into five groups, from "Liberals" to "Alt-Right," based on their degree of radicalization. Liberals included channels by Joe Rogan and Steven Bonnell II. "Skeptics" included Carl Benjamin, Jordan Peterson, and Dave Rubin. "Conservatives" included YouTubers like Steven Crowder, Dennis Prager of PragerU, and Ben Shapiro. The "Alt-Lite" category included both fringe creators that espouse more mainstream conservative views, like InfoWars' Paul Joseph Watson, and those that express more explicitly white nationalist messages, like Stefan Molyneux and Lauren Southern. The most extreme category, the "Alt-Right," refers to those who push strong anti-Semitic messages and advocate for the genetic superiority of white people, including Richard Spencer, Red Ice TV, and Jean-Francois Gariepy.
Munger and Phillips found that every part of the Alternative Influence Network rose in viewership between 2013 and 2016. Since 2017, they say, global hourly viewership of these channels "consistently eclipsed" that of the top three US cable networks combined. To compare YouTube's global audience with the cable networks' US-centric audience, the researchers made deliberately conservative assumptions: each cable viewer was counted as watching all three networks for 24 hours straight each day, while each YouTube viewer was counted as watching a single video for only 10 minutes.
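As a back-of-the-envelope illustration of how a comparison like that works under those assumptions, consider the sketch below; the input numbers are placeholders, not figures from the study.

```python
# Back-of-the-envelope comparison under the paper's stated assumptions:
# every cable viewer watches all three networks 24 hours a day (an upper
# bound for cable), and every YouTube view counts as only 10 minutes of
# watch time (a lower bound for YouTube). Inputs are hypothetical.
HOURS_PER_DAY = 24
MINUTES_PER_YOUTUBE_VIEW = 10

def cable_viewer_hours(avg_viewers):
    """Upper-bound daily viewer-hours for cable news."""
    return avg_viewers * HOURS_PER_DAY

def youtube_viewer_hours(daily_views):
    """Lower-bound daily viewer-hours for the YouTube channels."""
    return daily_views * MINUTES_PER_YOUTUBE_VIEW / 60

cable = cable_viewer_hours(avg_viewers=5_000_000)        # hypothetical input
youtube = youtube_viewer_hours(daily_views=750_000_000)  # hypothetical input
print(f"cable: {cable:,.0f} viewer-hours/day, YouTube: {youtube:,.0f}")
```

Even with the scales tilted toward cable this way, the researchers say the YouTube channels come out ahead.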
Overall viewership for the Alternative Influence Network has exploded in recent years, mirroring the far right's real-world encroachment on the national stage. But the report found that viewership on YouTube of the most extreme far-right content, specifically the Alt-Lite and Alt-Right groups, has actually declined since 2017, while videos in the Conservative category more than doubled in popularity.
Lewis says that the decline could be explained by changes in the universe of right-wing video creators. Some of the creators she included in the list of Alternative Influence Network channels have lost popularity since her study was published, while others have emerged to take their place. However, this latter group was not included in the Penn State researchers' report. Munger said the findings are preliminary and part of a working paper.
Nonetheless, Lewis praises the Penn State paper as essential reading for anyone studying YouTube politics. She lauds it as the first quantitative study on YouTube to shift focus away from the recommendation algorithm, a transition that she says is crucial. Ribeiro agrees, describing it as a fascinating and novel perspective that he believes will encourage broader scholarly analysis in the field.
One thing that's clear is that the remaining viewers of Alt-Right videos are significantly more engaged than other viewers, based on an analysis of the ratio of likes and comments to views for each video.
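A simple way to think about that engagement metric is likes plus comments, divided by views, per video. The sketch below illustrates the calculation; the field names mirror the YouTube Data API's statistics object, and the sample numbers are made up.

```python
# One simple engagement metric: (likes + comments) / views for each video.
# Field names mirror the YouTube Data API's "statistics" object; the sample
# numbers below are made up.
def engagement_ratio(stats):
    views = int(stats.get("viewCount", 0))
    if views == 0:
        return 0.0
    likes = int(stats.get("likeCount", 0))
    comments = int(stats.get("commentCount", 0))
    return (likes + comments) / views

sample = {"viewCount": "120000", "likeCount": "9500", "commentCount": "2100"}
print(f"engagement ratio: {engagement_ratio(sample):.4f}")  # 0.0967
```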
Munger and Phillips say they set out to illustrate the complexity of YouTube's alternative political ecosystem, and to encourage the development of more comprehensive, evidence-based narratives to explain YouTube politics.
"For these far-right groups, the audience is treating it much more as interactive space," said Munger, referring to that engagement analysis. "And this could lead to the creation of a community," which is a much more potent persuasive force than any recommendation system. When it comes to radicalization, he says, these are the sorts of factors we should be concerned about, not the effects of each algorithmic tweak.
Do you know more about YouTube? Email Paris Martineau at paris_martineau@wired.com. Signal: +1 (267) 797-8655. WIRED protects the confidentiality of its sources, but if you wish to conceal your identity, here are the instructions for using SecureDrop. You can also mail us materials at 520 Third Street, Suite 350, San Francisco, CA 94107.