YouTube, hungry for views, reportedly allowed disturbing, conspiratorial videos to thrive

Fires are flaring just about everywhere these days, but one crisis that can’t be ignored is the proliferation of disturbing and conspiratorial content on YouTube, a platform viewed every hour by millions of young, impressionable users. White nationalist propaganda, anti-vaccination garbage, nightmarish kids’ content, and “soft-core pedophile rings” are just a few of the things poisoning the platform, and though the company has taken a few steps toward moderating its content, Bloomberg reports it’s only doing so begrudgingly.

To be clear, the issue isn’t just the presence of these videos but, as Bloomberg puts it, that YouTube “allows the nonsense to flourish.” Speaking of the company’s algorithms, the report continues, “[I]n some cases, through its powerful artificial intelligence system, it even provides the fuel that lets it spread.”

The damning report quotes 20 people who work at, or recently left, YouTube, including five senior personnel who left the company over “the platform’s inability to tame extreme, disturbing videos.” By their accounts, YouTube has routinely ignored calls to moderate and address the spread of this content—dubbed “bad virality” by software engineers—so as not to hurt its engagement metrics.

As Bloomberg puts it:

In recent years, scores of people inside YouTube and Google, its owner, raised concerns about the mass of false, incendiary and toxic content that the world’s largest video site surfaced and spread. One employee wanted to flag troubling videos, which fell just short of the hate speech rules, and stop recommending them to viewers. Another wanted to track these videos in a spreadsheet to chart their popularity. A third, fretful of the spread of “alt-right” video bloggers, created an internal vertical that showed just how popular they were. Each time they got the same basic response: Don’t rock the boat.

YouTube declined to comment on the majority of the article’s claims, but it did provide an emailed statement. “Our primary focus has been tackling some of the platform’s toughest content challenges,” it reads. “We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies—we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.”

Contrast that with Bloomberg’s report that YouTube urged staff not to search for or moderate questionable content on their own, lest the company jeopardize its legal standing:

Yet, in the past, YouTube actively dissuaded staff from being proactive. Lawyers verbally advised employees not assigned to handle moderation to avoid searching on their own for questionable videos, like viral lies about Supreme Court Justice Ruth Bader Ginsburg, according to one former executive upset by the practice. The person said the directive was never put in writing, but the message was clear: If YouTube knew these videos existed, its legal grounding grew thinner. Federal law shields YouTube, and other tech giants, from liability for the content on their sites, yet the companies risk losing the protections of this law if they take too active an editorial role.

Bloomberg’s reporting also details a culture that forced those overseeing content policies to “fight tooth and nail” for resources. Furthermore, a cursory search of some of the platform’s most dangerous channels shows that the changes it has rolled out are sporadic at best.

“YouTube should never have allowed dangerous conspiracy theories to become such a dominant part of the platform’s culture,” says former YouTube employee Micah Schaffer, who joined the company in 2006. Speaking of those early days, he says, “We may have been hemorrhaging money. But at least dogs riding skateboards never killed anyone.”

Read Bloomberg’s full piece here.
