For years, YouTube has been a highly effective megaphone for conspiracy theorists. Photo / 123RF
A new study examines YouTube's efforts to limit the spread of conspiracy theories on its site, from videos claiming the end times are near to those questioning climate change.
Climate change is a hoax, the Bible predicted President Donald Trump's election and Elon Musk is a devil worshipper trying to take over the world.
All of these fictions have found life on YouTube, the world's largest video site, in part because YouTube's own recommendations steered people their way.
For years, it has been a highly effective megaphone for conspiracy theorists, and YouTube, owned and run by Google, has admitted as much. In January 2019, YouTube said it would limit the spread of videos "that could misinform users in harmful ways."
One year later, YouTube recommends conspiracy theories far less than before. But its progress has been uneven and it continues to advance certain types of fabrications, according to a new study from researchers at the University of California, Berkeley.
YouTube's efforts to curb conspiracy theories pose a major test of Silicon Valley's ability to combat misinformation, particularly before this year's elections. The study, which examined 8 million recommendations over 15 months, provides one of the clearest pictures yet of that fight, and the mixed findings show how challenging the issue remains for tech companies like Google, Facebook and Twitter.
The researchers found that YouTube has nearly eradicated some conspiracy theories from its recommendations, including claims that the Earth is flat and that the US government carried out the September 11 terrorist attacks, two falsehoods the company identified as targets last year. In June, YouTube said the amount of time people spent watching such videos from its recommendations had dropped by 50 per cent.
Yet the Berkeley researchers found that just after YouTube announced that success, its recommendations of conspiracy theories jumped back up and then fluctuated over the next several months.
The data also showed that other falsehoods continued to flourish in YouTube's recommendations, like claims that aliens created the pyramids, that the government is hiding secret technologies and that climate change is a lie.
The researchers argue those findings suggest that YouTube has decided which types of misinformation it wants to root out and which types it is willing to allow. "It is a technological problem, but it is really at the end of the day also a policy problem," said Hany Farid, a computer science professor at the University of California, Berkeley, and co-author of the study.
"If you have the ability to essentially drive some of the particularly problematic content close to zero, well then you can do more on lots of things," he added. "They use the word 'can't' when they mean 'won't.'"
Farshad Shadloo, a YouTube spokesman, said the company's recommendations aimed to steer people toward authoritative videos that leave them satisfied. He said the company was continually improving the algorithm that generates the recommendations. "Over the past year alone, we've launched over 30 different changes to reduce recommendations of borderline content and harmful misinformation, including climate change misinformation and other types of conspiracy videos," he said. "Thanks to this change, watchtime this type of content gets from recommendations has dropped by over 70 percent in the U.S."
YouTube's powerful recommendation algorithm, which pushes its 2 billion monthly users to videos it thinks they will watch, has fuelled the platform's ascent to become the new TV for many across the world. The company has said its recommendations drive over 70 per cent of the more than 1 billion hours people spend watching YouTube videos each day, making the software that picks the recommendations among the world's most influential algorithms.
Yet that success has come with a dark side. Research has shown that YouTube's recommendations have systematically amplified divisive, sensationalist and clearly false videos. Other algorithms meant to capture people's attention in order to show more ads, like Facebook's newsfeed, have had the same problem.
The stakes are high. YouTube faces an onslaught of misinformation and unsavoury content uploaded daily. The FBI recently identified the spread of fringe conspiracy theories as a domestic terror threat.
Last month, a German man uploaded a screed to YouTube saying that "invisible secret societies" use mind control to abuse children in underground bunkers. He later shot and killed nine people in a suburb of Frankfurt.
To study YouTube, Farid and another Berkeley researcher, Marc Faddoul, teamed up with Guillaume Chaslot, a former Google engineer who helped develop the recommendation engine and now studies it.
Since October 2018, the researchers have collected recommendations that appeared alongside videos from more than 1,000 of YouTube's most popular and recommended news-related channels, making their study among the longest and most in-depth examinations of the topic. They then trained an algorithm to rate, on a scale from 0 to 1, the likelihood that a given video peddled a conspiracy theory, including by analysing its comments, transcript and description.
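The study's model is not published in detail here, but the basic approach, training a text classifier to emit a likelihood score from a video's words, can be sketched in a few lines. The sketch below is an illustrative assumption, not the researchers' code: the library choices, example texts and labels are hypothetical.

```python
# A minimal sketch of a 0-to-1 "conspiracy likelihood" rater, assuming
# labelled training examples. The study's real model, features and data
# are not described here; everything below is illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each training example combines a video's description, transcript and
# comments into one text blob, paired with a 0/1 label (hypothetical).
train_texts = [
    "nasa hides the truth the earth is flat wake up",
    "today's weather forecast calls for light rain and mild temperatures",
]
train_labels = [1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# predict_proba gives the probability, between 0 and 1, that a new
# video's text reads like a conspiracy theory.
new_video_text = "secret societies use hidden technology to control world leaders"
score = model.predict_proba([new_video_text])[0][1]
print(round(score, 2))
```

In practice a model like this would need many thousands of labelled examples and careful validation; the sketch only shows how comments, transcripts and descriptions can be turned into a single likelihood score.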
Like most attempts to study YouTube, the approach has flaws. Determining which videos push conspiracy theories is subjective, and leaving it to an algorithm can lead to mistakes.
To account for errors, the researchers included in their study only videos that scored higher than 0.5 on the likelihood scale. They also weighted each video by its rating, so a video rated 0.75, for example, counted as three-quarters of a conspiracy-theory recommendation.
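As a rough illustration of that thresholding and weighting, here is a short sketch using made-up scores rather than the study's data; the function name and numbers are assumptions for the example only.

```python
# Hypothetical example of the study's counting rule: ignore recommendations
# scoring below 0.5, and let the rest count fractionally, so a video rated
# 0.75 adds 0.75 to the tally of conspiracy-theory recommendations.
def weighted_conspiracy_count(scores, threshold=0.5):
    return sum(score for score in scores if score >= threshold)

# Made-up ratings for one batch of recommendations.
daily_scores = [0.92, 0.75, 0.40, 0.55, 0.10]
print(round(weighted_conspiracy_count(daily_scores), 2))  # 0.92 + 0.75 + 0.55 = 2.22
```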
The recommendations were also collected without logging into a YouTube account, which isn't how most people use the site. When logged in, recommendations are personalised based on people's viewing history. But researchers have been unable to re-create personalised recommendations at scale, and as a result have struggled to study them.
That challenge has deterred other studies. Arvind Narayanan, a computer science professor at Princeton University, said that he and his students abandoned research on whether YouTube could radicalise users because they couldn't examine personalised recommendations. Late last year, Narayanan criticized a similar study — which concluded that YouTube hardly radicalised users — because it studied only logged-out recommendations, among other issues.
Narayanan reviewed the Berkeley study at the request of The New York Times and said it was valid to study the rate of conspiracy-theory recommendations over time, even when logged out. But without examining personalised recommendations, he said, the study couldn't offer conclusions about the impact on users.
"To me, a more interesting question is, 'What effect does the promotion of conspiracy videos via YouTube have on people and society?'" Narayanan said in an email. "We don't have good ways to study that question without YouTube's cooperation."
Shadloo of YouTube questioned the study's findings because the research focused on logged-out recommendations, which, he reiterated, don't represent most people's experience. He also said the list of channels the study used to collect recommendations was subjective and didn't represent what's popular on the site. The researchers said they chose the most popular and recommended news-related channels.
The study highlights a potpourri of paranoia and delusion. Some videos claim that angels are hidden beneath the ice in Antarctica (1.3 million views); that the government is hiding technologies like levitation (5.5 million views); that photos from the Mars rover prove there was once civilisation on the planet (850,000 views); and that footage of dignitaries reacting to something at George Bush's funeral confirms a major revelation is coming (1.3 million views).
Often the videos run with advertising, which helps finance the creators' next production. YouTube also takes a cut.
Some types of conspiracy theories were recommended less and less through 2019, including videos with end-of-the-world prophecies.
One video viewed 600,000 times and titled "Could Emmanuel Macron be the Antichrist?" claimed there were signs that the French president was the devil. (Some of its proof: He earned 66.06 per cent of the vote.)
In December 2018 and January 2019, the study found that YouTube recommended the video 764 times in the "Up next" playlist of recommendations that appeared alongside videos analysed in the study. Then the recommendations abruptly stopped.
Videos promoting QAnon, the pro-Trump conspiracy theory that claims "deep state" paedophiles control the country, had thousands of recommendations in early 2019, according to the study. Over the past year, YouTube has sharply cut recommendations of QAnon videos, in part by seemingly avoiding some channels that push the theory.
While YouTube recommends such videos less, it still hosts many of them on its site. For some topics like the moon landing and climate change, it now aims to undercut debunked claims by including Wikipedia blurbs below videos.
Many of the conspiracy theories YouTube continues to recommend come from fringe channels.
Consider Perry Stone, a televangelist who preaches that patterns in the Bible can predict the future, that climate change is not a threat and that world leaders worship the devil. YouTube's recommendations of his videos have steadily increased, steering people his way nearly 8,000 times in the study. Many of his videos now collect hundreds of thousands of views each.
"I am amused that some of the researchers in nonreligious academia would consider portions of my teaching that link biblical prophecies and their fulfilment to this day and age, as a mix of off-the-wall conspiracy theories," Stone said in an email. Climate change, he said, had simply been rebranded: "Men have survived Noah's flood, Sodom's destruction, Pompeii's volcano."
As for the claim that world leaders are "Luciferian," the information "was given directly to me from a European billionaire," he said. "I will not disclose his information nor his identity."