The Interpreter

As Germans Seek News, YouTube Delivers Far-Right Tirades

A far-right protest in Chemnitz, Germany, last month. People searching on YouTube for information about the riots there were directed toward extremist videos about the riots — and then on to far-right videos on other subjects. Credit: Jens Meyer/Associated Press

CHEMNITZ, Germany — The day after far-right demonstrators took over the streets here, Sören Uhle, a city official who oversees municipal marketing and development, began to get strange phone calls from reporters.

The man whose killing had set off the riots, they said, had died while trying to stop asylum seekers from molesting a local woman. And it wasn’t just one local man who had been killed, but two. Could he comment?

These sorts of accusations suddenly seemed to be everywhere. But none were true. They had come, Mr. Uhle and others suspected, from social media — particularly YouTube.

Ray Serrato, a Berlin-based digital researcher, noticed the tide of misinformation when his wife’s uncle showed him a YouTube video that claimed the rioters had been Muslim refugees.

The video, posted by an obscure fringe group, was rambling, and it appeared to be cheaply produced. Yet it had nearly half a million views — far more than any news video on the riots. How was that possible?

Mr. Serrato scraped YouTube databases for information on every Chemnitz-related video published this year. He found that the platform’s recommendation system consistently directed people toward extremist videos on the riots — then on to far-right videos on other subjects.

Users searching for news on Chemnitz would be sent down a rabbit hole of misinformation and hate. And as interest in Chemnitz grew, it appears, YouTube funneled many Germans to extremist pages, whose view counts skyrocketed.

Activists say this may have contributed to a flood of misinformation, helping extremists shape public perceptions even after they had been run off Chemnitz’s streets.

“This was new,” Mr. Uhle said. “It’s never happened to me before that mainstream media, big German newspapers and television channels, ask me about false news and propaganda that had clearly become so pervasive that people just bought it.”

Researchers who study YouTube say the episode, far from being isolated, reflects the platform’s tendency to push everyday users toward politically extreme content — and, often, to keep them there.

A YouTube spokeswoman declined to comment on the accusations, saying the recommendation system is intended to “give people video suggestions that leave them satisfied.” She said the company planned to work with news publishers to help “build a better news experience on YouTube.”

Though YouTube has typically drawn less scrutiny than other social networks, that may be changing. Its parent company, Google, faced criticism from American lawmakers this week for declining to send its chief executive to congressional hearings attended by chief executives from Twitter and Facebook.

A closed system

YouTube’s recommendation system is the core of its business strategy: Getting people to click on one more video means serving them more ads. The algorithm is sophisticated, constantly learning what keeps users engaged. And it is powerful. A high ranking from the algorithm can mean huge audiences for a video.

Mr. Serrato wondered if that explained how his family member had discovered the conspiracy video. He had read studies about users who blindly followed the recommendation system; inevitably, they seemed to end up watching long series of far-left or far-right videos.

Zeynep Tufekci, a prominent social media researcher at the University of North Carolina at Chapel Hill, has written that these findings suggest that YouTube could become “one of the most powerful radicalizing instruments of the 21st century.”

But, as Ms. Tufekci and other researchers stress, such experiments are anecdotal.

Mr. Serrato wanted to get a fuller picture of how YouTube shapes perceptions of events. So he conducted something known as a network analysis, applying techniques he had used to track hate speech in Myanmar in his day job as an analyst with Democracy Reporting International, a respected global governance monitor.

Using YouTube’s public developer interface, Mr. Serrato plugged in a dozen recent videos related to Chemnitz. For each, he scraped YouTube’s recommendations for what to watch next. Then he did the same for those videos, and so on. Eventually, he identified a network of about 650 videos, nearly all from this year.
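The crawl Mr. Serrato describes is, in effect, a breadth-first traversal of YouTube’s recommendation graph: start from seed videos, collect what is recommended next, then repeat for each newly found video. The sketch below illustrates that logic with a stand-in `get_recommendations` function; in the real study this step would be a request to YouTube’s developer interface, which requires an API key and is omitted here.

```python
from collections import deque

def crawl_recommendations(seed_ids, get_recommendations, max_videos=650, depth=3):
    """Breadth-first crawl of a recommendation graph.

    seed_ids: starting video IDs.
    get_recommendations: function video_id -> list of recommended IDs
        (a stand-in for a real YouTube API call).
    Returns the set of discovered videos and the recommendation edges.
    """
    seen = set(seed_ids)
    edges = []
    queue = deque((vid, 0) for vid in seed_ids)
    while queue and len(seen) < max_videos:
        video, level = queue.popleft()
        if level >= depth:
            continue  # stop expanding beyond the chosen crawl depth
        for rec in get_recommendations(video):
            edges.append((video, rec))
            if rec not in seen and len(seen) < max_videos:
                seen.add(rec)
                queue.append((rec, level + 1))
    return seen, edges

# Toy recommender: each "video" recommends two neighbors.
fake_recs = {"a": ["b", "c"], "b": ["c", "d"], "c": ["d", "a"], "d": ["a", "b"]}
videos, edges = crawl_recommendations(["a"], lambda v: fake_recs.get(v, []))
```

Capping the crawl by total videos and by depth, as above, is what keeps such a study to a bounded network — here a few hundred videos rather than the whole platform.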

The results, he said, were disturbing. The network showed a tight cluster of videos that Mr. Serrato identified as predominantly conspiracy theorist or far right.

This was the first sign that YouTube’s algorithm systematically directs users toward extremist content. A more neutral algorithm would most likely produce a few distinct clusters of videos — one of mainstream news coverage, another of conspiracy theories, another of extremist groups. Those who began in one cluster would tend to stay there.

Instead, the YouTube recommendations bunched them all together, sending users through a vast, closed system composed heavily of misinformation and hate.
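The contrast between “a few distinct clusters” and “all bunched together” can be made concrete: given the list of recommendation edges from a crawl, count the connected components of the graph. Several components match the neutral structure described above; one dominant component matches what Mr. Serrato found. A minimal union-find sketch (the video IDs are hypothetical):

```python
def connected_components(edges):
    """Group videos into clusters: two videos share a cluster if a chain
    of recommendations (in either direction) links them."""
    parent = {}  # union-find forest over video IDs

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in edges:
        parent[find(a)] = find(b)  # merge the two clusters

    clusters = {}
    for node in parent:
        clusters.setdefault(find(node), set()).add(node)
    return list(clusters.values())

# Two mutually linked news videos, plus a separate fringe pair,
# yields two components -- the "neutral" shape:
components = connected_components([("news1", "news2"), ("fringe1", "fringe2")])
```

Add a single edge between the two pairs and the graph collapses into one component — the closed-system shape the analysis describes.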

Viewers who come to YouTube for down-the-middle news may quickly find themselves in a world of extremists, Mr. Serrato said.

“That’s what I found bizarre,” he said. “Why are they so close together, unless the aim is to trigger a reaction?” Content that engages viewers’ emotions or curiosity, he suspected, would hook them in.

And it wasn’t just that the platform directed people to unreliable videos about the subject they had sought out — in this case, Chemnitz.

A memorial in Chemnitz, Germany, near where Daniel Hillig was fatally stabbed. Anger exploded after word spread that an Iraqi and a Syrian asylum seeker were suspected of carrying out the attack. Credit: Sean Gallup/Getty Images

Many of the videos in Mr. Serrato’s analysis were unrelated to Chemnitz. Some offered positive portrayals of white nationalism in general or of Alternative for Germany, a far-right political party. Others went further astray, detailing fringe conspiracies; one argued that President Trump is a pawn of the Rothschild banking family.

Why would YouTube surface videos like these in a search for news stories?

How many steps does YouTube’s algorithm take from news story to fever swamp? “Only two,” Mr. Serrato said. “By the second, you’re quite knee-deep in the alt-right.”

Perhaps most striking is what was absent. The algorithm rarely led back to mainstream news coverage, or to liberal or centrist videos on Chemnitz or any other topic. Once on the fringes, the algorithm tended to stay there, as if that had been the destination all along.

From fringe to mainstream

Activists and residents in Chemnitz say far-right conspiracy theories seemed unusually common in the days before and after the demonstration.

Oliver Flesch, a far-right figure on YouTube, posted a series of videos misrepresenting the killing that set off the riots, with titles like “German Stabbed to Death Just Because He Wanted to Help Our Women.” Another claimed the asylum seekers had killed two Germans.

Mr. Flesch, who has 20,000 subscribers, operates in a political bubble. Yet his claims had filtered into the mainstream enough that journalists asked Mr. Uhle, the Chemnitz official, about them. How?

The algorithm may have helped. Mr. Serrato’s network analysis led to 16 of Mr. Flesch’s videos, and to five by the obscure right-wing rapper Chris Ares, with whom Mr. Flesch sometimes does guest spots.

And misinformation can travel in other ways. Some German officials said this week that a widely circulated video, appearing to show a far-right activist chasing a dark-skinned person during Chemnitz’s riots, may have been faked.

Thomas Hoffmann, who helps run a local refugee organization, was on a train from Hamburg when the riots broke out. So he searched YouTube for “Chemnitz” and the date, hoping to follow the events.

Instead, the platform returned obvious forgeries. One video of dark-skinned residents being attacked was edited to make them look like the aggressors. Others were interspersed with footage from previous rallies, to make this one look more peaceful than it was.

“It was incredible how much blatantly doctored material there was,” Mr. Hoffmann said. “When you click on one video, whether you like it or not, another one is proposed that features content from far-right conspiracy theories.”

Is YouTube worse than others?

YouTube has been more cooperative with the German authorities about removing hate speech than other social media companies, said Flemming Ipsen, who tracks political extremism at a government-linked internet monitor.


Björn Höcke, front center, a member of the far-right party Alternative for Germany, at a demonstration in Chemnitz last week. Credit: Jens Meyer/Associated Press

But some researchers consider YouTube to be unusually permissive about content that it does not consider overt hate speech, and its algorithm unusually aggressive in pushing users toward political fringes. YouTube also designates some content as borderline, neither blocking nor promoting it.

YouTube says it does not code videos by political content, but rather by viewer interest. Critics say that leads the platform to surface fringe material that reliably wins more clicks.

Mr. Serrato said that even while researching videos he found abhorrent, he was unable to resist the pull of the recommendations.

“As soon as I was on one of the videos, I thought, O.K., now I’m going to watch the next one,” he said. “That’s YouTube’s goal. I stay engaged, ads play. And it works.”

Guillaume Chaslot, a former engineer at YouTube’s owner, Google, said that during and after high-profile political events like the demonstration in Chemnitz, extremism and misinformation often spike on the platform. But this reflects deeper tendencies in YouTube’s algorithm, he said.

“The example I like to cite is the flat-earth theory, because it’s apolitical,” Mr. Chaslot said. Videos claiming the earth is flat, he said, are “still going viral, still getting highly recommended by the YouTube algorithm, because it gets watch time.”

Mr. Chaslot worked on YouTube’s algorithm until 2013, when he was fired. Google has said he was fired for poor performance; Mr. Chaslot has cited disagreements over the company’s direction.

Now, he studies the algorithm from outside, most recently analyzing its recommendations during the 2016 presidential campaign. As in Chemnitz, he found that YouTube’s suggestions consistently nudged users into extremist content.

“YouTube doesn’t give you a straight representation of the world,” Mr. Chaslot said. “It looks like reality, but it deforms reality, because it is biased toward watch time.”

Even in Germany, which has some of the toughest social media restrictions of any democracy, officials say they have little power to regulate the vast majority of social media content.

“Lies, propaganda and manipulation are harmful for society, but on their own are not illegal — and so our hands are often tied,” said Mr. Ipsen, of the government-linked internet monitor.

German officials have urged social media companies to make their algorithms more transparent. American lawmakers have done the same, citing research linking social media to polarization, foreign meddling and hate speech.

But the companies are refusing.

“The algorithm is central to their business model,” Mr. Ipsen said. “All we can do is remind them of their social responsibility.”

The Interpreter is a column by Max Fisher and Amanda Taub exploring the ideas and context behind major world events. Follow them on Twitter @Max_Fisher and @amandataub.

Christopher F. Schuetze contributed reporting from Berlin.
