Blatantly false conspiracy theories will no longer be allowed on YouTube. Photo/Getty Images.
Bigfoot, UFOs, 9/11, JFK.
From aliens to government cover-ups, conspiracy theories have been ubiquitous in popular culture, media and entertainment for decades. More recently, the rise of the internet and online forums has led to an explosion in conspiracy theory content.
While belief in conspiracy theories runs the gamut from casual X-Files fans to hardcore, tinfoil-hat-wearing believers, opinion polls regularly highlight their appeal.
More than 60 per cent of Americans suspect the government isn't telling the truth about the JFK assassination, for example, and 50 per cent doubt the official explanation of the 9/11 attacks. In Australia, nearly one in four believe aliens have visited the earth.
It was unsurprising, then, that YouTube's crackdown on conspiracy theory content announced last week — which came following sustained political pressure — sparked outrage and cries of censorship from some corners of the internet.
The Google-owned online video monopoly said it was tweaking its recommendation algorithm to reduce the spread of potentially "harmful" content, such as videos that promote a "phony miracle cure", claim the earth is flat or spread "blatantly false claims about historic events like 9/11".
"9/11 conspiracy content completely censored on YouTube," wrote one user on Reddit's popular conspiracy forum, which has nearly 800,000 subscribers.
"As of today, YouTube is pulling the veil of censorship over countless hours of research and information hosted on their platform. Go to YouTube and type in '9/11' and scroll. Keep scrolling. Scroll forever. You will never again find a 9/11 truth video in the search results, in the suggested video bar, anywhere. All conspiracy content is essentially unlisted on the platform now. The only way to find this content now is to already have the link."
That's not quite correct — it's still possible to find plenty of conspiracy videos — and the company insists it is not removing content from the platform as long as it complies with its "Community Guidelines".
"As part of our ongoing efforts to improve the user experience across our site, we'll begin reducing recommendations of borderline content or videos that could misinform users in harmful ways," a YouTube spokeswoman said in a statement.
"We will be making this change gradually as we improve our recommendation systems and increase their accuracy over time."
Colin Klein, a senior lecturer in philosophy at the Australian National University who has researched online conspiracy theory communities, notes that Reddit's conspiracy forum even lists on its front page a number of conspiracy theories that turned out to be true.
"There's a kind of picture of the conspiracy theorist as sort of born fully formed and they connect everything to everything and have their tinfoil hat on," he said. "But a lot of people get into conspiracy theories through more narrow interests and start with things that are reasonable, or in many cases actually happened."
He notes conspiratorial thinking is virtually everywhere these days. "It's important to differentiate between people who are willing to entertain conspiracy theories and people who are going to go to the wall," he said.
"When you say conspiracy most people think of lizard people replacing world leaders, but once you start looking around you realise there's lots of conspiracy theorising we do all the time. Most of America thinks either that Trump had some kind of collusion with Russia or that (it's) a conspiracy by the media."
The problem, he says, is that there is a danger of going too far down the rabbit hole. He cites Harvard professor Cass Sunstein's concept of "crippled epistemology", a term used to describe extremists who get their information from only a small number of inaccurate sources.
"You start distrusting ordinary sources of news, then it's hard to go back," Mr Klein said. "You say, 'I don't trust The New York Times, I'm going to start watching InfoWars.' It's a very hard process to roll yourself back from once you start distrusting those mainstream sources. That's the danger."
That's why, despite noting that many people engage with conspiracy theory content purely for entertainment, he believes YouTube's decision was ultimately the right one. "It's not just idly clicking around," he said.
"Once you go down the path to a certain degree, you're not literally stuck but it's hard to come back. I do have some worry about the kind of slippery slope argument, but I think it's probably an overall good move, a good proactive step."
There's a separate, but related reason he supports the change. "We know there are psychological effects where people will tend to weigh evidence in proportion to how much they hear about it," he said.
"You put a climate sceptic and a climate scientist on stage and people will think it's about 50-50, even though you have to go through about 99 scientists before you find the one that disagrees."
In other words, having endless 9/11 conspiracy theory videos show up in search results or in video recommendations means neutral observers "tend to get a mistaken view of the baseline, people tend to over-estimate how much dispute there is of various things".
"That's a well studied psychological phenomenon," he said. "Even Google search suggestions, if I type in 'do vaccines …', it comes up with ' … cause autism', ' … cause cancer'. People think, if these are the top Google hits it must be quite important."
"In some sense you're serving up other people's fears, devoid of context. It's a perpetual whack-a-mole kind of problem (for Google), but they do bear a responsibility."