What goes behind the radicalisation of men and women on YouTube

YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences

Kevin Roose | NYT
Last Updated: Jun 09 2019 | 12:34 AM IST
Caleb Cain was a college dropout looking for direction. He turned to YouTube.

Soon, he was pulled into a far-right universe, watching thousands of videos filled with conspiracy theories, misogyny and racism. "I was brainwashed."

Caleb Cain pulled a Glock pistol from his waistband, took out the magazine and casually tossed both onto the kitchen counter. "I bought it the day after I got death threats," he said.

The threats, Mr. Cain explained, came from right-wing trolls in response to a video he had posted on YouTube a few days earlier. In the video, he told the story of how, as a liberal college dropout struggling to find his place in the world, he had gotten sucked into a vortex of far-right politics on YouTube.

"I fell down the alt-right rabbit hole," he said in the video.

Mr. Cain, 26, recently swore off the alt-right nearly five years after discovering it, and has become a vocal critic of the movement. He is scarred by his experience of being radicalized by what he calls a "decentralized cult" of far-right YouTube personalities, who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate I.Q. differences explained racial disparities, and that feminism was a dangerous ideology.

"I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging," he said. "I was brainwashed."

Over years of reporting on internet culture, I've heard countless versions of Mr. Cain's story: an aimless young man — usually white, frequently interested in video games — visits YouTube looking for direction or distraction and is seduced by a community of far-right creators.

Some young men discover far-right videos by accident, while others seek them out. Some travel all the way to neo-Nazism, while others stop at milder forms of bigotry.
 
The common thread in many of these stories is YouTube and its recommendation algorithm, the software that determines which videos appear on users' home pages and inside the "Up Next" sidebar next to a video that is playing. The algorithm is responsible for more than 70 percent of all time spent on the site.

The radicalization of young men is driven by a complex stew of emotional, economic and political elements, many having nothing to do with social media. But critics and independent researchers say YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.

"There's a spectrum on YouTube between the calm section - the Walter Cronkite, Carl Sagan part - and Crazytown, where the extreme stuff is," said Tristan Harris, a former design ethicist at Google, YouTube's parent company. "If I'm YouTube and I want you to watch more, I'm always going to steer you toward Crazytown."
 
Steven Crowder, a conservative commentator, has gained nearly four million subscribers — attracting viewers like Mr. Cain — with shock-jock antics like this parody, which drew from a widely recognized "Schoolhouse Rock" cartoon.

In recent years, social media platforms have grappled with the growth of extremism on their services. Many platforms have barred a handful of far-right influencers and conspiracy theorists, including Alex Jones of Infowars, and tech companies have taken steps to limit the spread of political misinformation.

YouTube, whose rules prohibit hate speech and harassment, took a more laissez-faire approach to enforcement for years. This past week, the company announced that it was updating its policy to ban videos espousing neo-Nazism, white supremacy and other bigoted views. The company also said it was changing its recommendation algorithm to reduce the spread of misinformation and conspiracy theories.

With two billion monthly active users uploading more than 500 hours of video every minute, YouTube's traffic is estimated to be the second highest of any website, behind only Google.com. According to the Pew Research Center, 94 percent of Americans ages 18 to 24 use YouTube, a higher percentage than for any other online service.

Like many Silicon Valley companies, YouTube is outwardly liberal in its corporate politics. It sponsors floats at L.G.B.T. pride parades and celebrates diverse creators, and its chief executive endorsed Hillary Clinton in the 2016 presidential election. President Trump and other conservatives have claimed that YouTube and other social media networks are biased against right-wing views, and have used takedowns like those announced by YouTube on Wednesday as evidence for those claims.
In reality, YouTube has been a godsend for hyper-partisans on all sides. It has allowed them to bypass traditional gatekeepers and broadcast their views to mainstream audiences, and has helped once-obscure commentators build lucrative media businesses.
It has also been a useful recruiting tool for far-right extremist groups. Bellingcat, an investigative news site, analyzed messages from far-right chat rooms and found that YouTube was cited as the most frequent cause of members' "red-pilling" — an internet slang term for converting to far-right beliefs. 
©2019 The New York Times News Service