If you specifically look for reasons why cannabis is bad you're more likely to find them. Photo / Getty Images
COMMENT:
In unveiling the details of the cannabis and euthanasia referendums this week, Justice Minister Andrew Little said he hoped that early release of the information would allow the public to participate in the process and make informed decisions at next year's election.
This sounds great in theory, but logic, reasoning and statistics will have little to do with the way most of us vote on either of these matters.
Studies in behavioural science have shown that our decision-making processes are deeply flawed and rarely driven by the rational side of the mind. Instead, they're influenced by a range of unseen biases and the lies we tell ourselves to justify the decisions we make.
Advertisers have long understood this. That's why so much money is spent every year trying to develop marketing campaigns that make us laugh, cry or feel part of something bigger. The forces that push us towards buying $200 sneakers are often the same as those that determine our views on more profound issues.
While we like to think we'll make a rational decision in the polling booth in November next year, we are all prisoners of the unseen forces that drive our decisions. In the end, we'll be influenced by which side can tell the best story.
Early polling has shown strong partisan lines forming on the cannabis issue, with those on the Left leaning towards "Yes" and those on the Right towards "No". It would be statistically remarkable for this divide to correspond so perfectly with political lines if every individual were weighing the evidence and making a decision independently.
The reality is that this is more likely to be a product of two cognitive biases we all suffer from: the bandwagon effect and confirmation bias.
The bandwagon effect is our tendency to believe something because we think others have already accepted it. So the more folks in your filter bubble who share articles about why cannabis is bad, the more likely you are to start holding the same view.
Confirmation bias, meanwhile, sees us reading more articles and opinions that justify our existing point of view. This is only made easier by the Google search bar, which lets us home in on sources that say exactly what we want them to say. If you type "cannabis ruins lives" into the search bar, you're likely to find plenty of results that say just that.
The problem with this is that simply showing people the other side of the story isn't going to change their minds. A 2017 study by researchers at Duke and Princeton universities in the US showed that Republicans become more conservative and Democrats more liberal when made to read opinions from the other side on contentious issues.
The reason is that those opinions are often couched in the language and morality of the opposing side, making them unpalatable to the reader. Vox writer Brian Resnick notes that other studies have shown the only way around this issue is to reframe an argument in the morality of the opposing side. To use cannabis as an example, the Left in New Zealand would stand a better chance of persuading the Right if it shifted the debate away from the social harm caused by imprisoning people for using the drug, and towards personal freedom and the business opportunities cannabis reform could deliver.
What this comes down to is that the best storytellers in the media will ultimately pull the debate their way. The problem, though, is that the best storytellers aren't always the most knowledgeable people when it comes to these issues – and this brings us to the Dunning-Kruger Effect and the anecdotal fallacy.
We will all have encountered a Dunning-Kruger at some point. This is the name given to the bias in which people believe they know far more about a topic than they really do. Ironically, the limits of their knowledge mean they can't identify the shortcomings in their own reasoning, and so they automatically believe their view is superior. When these people are given a vast platform to deliver their views, they can be incredibly persuasive to those looking for easy answers to complex issues.
In lieu of actual evidence, these people will often convince their readers with carefully cherry-picked examples that back their view. The person who tells you they're not worried about their smoking habit because they had a tobacco-puffing uncle who lived until 96 is a classic example of Dunning-Kruger and an anecdotal fallacy working in tandem. You might slap them with all the data in the world, but they've seen the truth and there's no turning back.
So do we really want to leave such important and highly contentious issues to the flawed decision-making we all suffer from? Do we really want to risk the possibility that the Dunning-Krugers of New Zealand might influence large swathes of the population?
This is not an argument against the democratic process, but rather a suggestion that sometimes it's better to trust the experts to inform a decision and then make it.
Imagine, for instance, if significant historical issues such as female suffrage or the abolition of slavery had been put to a referendum. Would we still be waiting for the right decision?