

The three-and-a-half-hour hearing with Google CEO Sundar Pichai and the House Judiciary Committee wasn’t exactly a showcase of deep knowledge of technology. One Republican representative complained that all the Google results for the Obamacare repeal act and the Republican tax bill were negative. Rep. Steve King (R-IA) had to be told that Google does not make the iPhone. Rep. Louie Gohmert (R-TX) demanded that Google be held liable for Wikipedia’s “political bias.”

But one lawmaker, Rep. Jamie Raskin (D-MD), raised an actually important and pressing issue: the way YouTube’s algorithms can be used to push conspiracy theories. “The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” he said. He was alluding to the Pizzagate conspiracy theory that led to an armed gunman showing up at a DC-area pizzeria in 2016 - a conspiracy theory spread, in part, on YouTube.

Raskin asked about another especially strange conspiracy theory that emerged on YouTube - “Frazzledrip,” which has deep ties to the QAnon and Pizzagate conspiracy theories. He asked Pichai, “Is your basic position that is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?” He added, “Are you taking the threats seriously?”

Raskin’s questions were getting at an important issue: YouTube, which Google purchased for $1.65 billion 12 years ago, has a conspiracy theory problem. It’s baked into the way the service works. And it appears that neither Congress nor YouTube is anywhere near solving it.

YouTube and conspiracy theories, explained

One billion hours’ worth of content is viewed on YouTube every single day. About 70 percent of those views come from YouTube’s recommendations, according to Algotransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.” YouTube’s content algorithms are incredibly powerful - they determine which videos show up in your search results, in the suggested videos stream, on the homepage, in the trending stream, and under your subscriptions. If you go to the YouTube homepage, algorithms dictate which videos you see and which ones you don’t. And if you search for something, an algorithm decides which videos you get first.

For example, as I write, I am listening to The Nutcracker Suite on YouTube, so YouTube has recommended a list of classical music videos, along with several others based on my viewing history. But the algorithm knows that I probably don’t want to listen to Nine Inch Nails right now, so it isn’t suggesting, say, Nine Inch Nails’ Broken album.
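To make that idea concrete, here is a deliberately simplified sketch of history-based ranking in Python. It is not YouTube’s actual algorithm, which is proprietary and far more complex; the video titles, topic tags, and the `recommend` function are invented purely for illustration.

```python
# Toy sketch of history-based recommendation ranking - illustrative only,
# not YouTube's real (and proprietary) system.
from collections import Counter

def recommend(watch_history, candidates, top_n=5):
    """Rank candidate videos by how well their topic tags match the tags
    of recently watched videos."""
    # Count how often each topic appears in the user's recent history.
    history_topics = Counter(tag for video in watch_history for tag in video["tags"])

    def score(video):
        # A candidate scores higher the more its tags overlap with the history.
        return sum(history_topics[tag] for tag in video["tags"])

    return sorted(candidates, key=score, reverse=True)[:top_n]

# A classical-music listener gets classical recommendations, not industrial rock.
history = [{"title": "The Nutcracker Suite", "tags": ["classical", "orchestral"]}]
candidates = [
    {"title": "Swan Lake", "tags": ["classical", "ballet"]},
    {"title": "Broken (album)", "tags": ["industrial", "rock"]},
]
print(recommend(history, candidates, top_n=1))  # -> [{'title': 'Swan Lake', ...}]
```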

But YouTube’s algorithms have an extremism problem. As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in the New York Times in March, the YouTube advertising model is based on you watching as many videos as they can show you (and the ads that appear before and during those videos). Whether the subject of the original video selected was right-leaning or left-leaning, or even nonpolitical, the algorithm tends to recommend increasingly more extreme videos - escalating the viewer, Tufekci wrote, from videos of Trump rallies to videos featuring “white supremacist rants, Holocaust denials, and other disturbing content.” Watching videos of Hillary Clinton and Bernie Sanders, on the other hand, led to videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.”

“If this is recommended to me, imagine what goes to people who actually watch this sort of content,” John Ganz tweeted on December 11, 2018.

In a statement, a YouTube spokesperson said, “YouTube is a platform for free speech where anyone can choose to post videos, subject to our Community Guidelines, which we enforce rigorously.”
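To illustrate the feedback loop Tufekci describes, here is a deliberately simplified simulation. The `intensity` and `predicted_watch_minutes` fields are invented assumptions; the sketch only shows how a ranker that optimizes predicted watch time, if more sensational videos hold attention longer, can drift toward ever more extreme recommendations. It is not a description of YouTube’s real system.

```python
# Toy simulation of an engagement-driven recommendation loop - an assumption-laden
# illustration, not YouTube's actual algorithm.

def next_recommendation(current_intensity, catalog):
    """Among videos 'near' the one just watched, pick the video the engagement
    model predicts will be watched longest."""
    nearby = [v for v in catalog if abs(v["intensity"] - current_intensity) <= 1]
    return max(nearby, key=lambda v: v["predicted_watch_minutes"])

# Hypothetical catalog: higher "intensity" = more sensational content, and
# (by assumption) longer predicted watch time.
catalog = [{"intensity": i, "predicted_watch_minutes": 2 + i} for i in range(10)]

video = catalog[0]             # start from a mild video
for _ in range(5):             # follow five recommendations in a row
    video = next_recommendation(video["intensity"], catalog)
    print(video["intensity"])  # climbs steadily: 1, 2, 3, 4, 5
```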
