Two opinion polls have come out at once saying dramatically different things about the current fortunes of New Zealand's political parties. One poll, promoted by Newshub as "epic", puts Labour 13 points ahead of National. The other poll, from 1News, showed a shock reversal of party fortunes, with Labour now 2 points behind. And once you add in the role of the minor parties, the difference between the two pictures painted by the polls is bizarre and hard to reconcile. But I have three possible explanations.
This has caused a lot of head-scratching and some interesting reactions and attempts to explain the discrepancy. I went on RNZ's Morning Report to put forward some explanations – you can listen to my five-minute interview here: NZ political polls: Which one is right?.
I suggest that there are three main possible explanations for the bizarre discrepancies: 1) general polling methodologies have become unreliable, 2) the time period in which the two surveys were undertaken might have produced different results, and 3) one of the results might simply be a "rogue poll", which happens occasionally, even for reputable market research companies.
I argued that of the two polls, it seems more likely that the 1News Colmar Brunton poll is the aberration, simply because it's less in line with the previous trends, in which Labour's polling had been improving and National's declining. Of course, there's no way of knowing and, in fact, both polls could be rogue. Therefore, possibly the best that can be taken from the situation is to average the two polls, which produces a result of: Labour: 46.4, National: 40.7, Greens: 6.1, and NZ First: 3.9.
All partisans are going to have their own view of which of the two poll results to believe. Unsurprisingly, the National Party deputy leader has quite blatantly indicated that she doesn't like the message in the Newshub poll, going on the AM Show today to say: "I just don't think that that's true. I just don't believe your numbers… I'm sorry, I really don't believe the numbers you put out last night are a true reflection of where it's all at... It's way better than that" – see: 'I just don't believe your numbers': Paula Bennett in denial after horror poll.
The same article gives an explanation of the methodology of Newshub's polling company: "Bennett suggested the methodology Reid Research uses could be to blame. Unlike most other polls, a quarter of Reid Research's sample is found via the internet – making it easier to reach younger demographics. 'It's a certain demographic that do the online polling', said Bennett. But all legitimate pollsters weight the results they get to reflect the demographics of the voting public, regardless of how they're sourced."
According to Claire Trevett, National MPs will now be insisting on seeing the results of the party's own commissioned internal polling, which can then be used as a "tie-breaker" to determine which of the two public polls is accurate: "When National's caucus meets tomorrow, MPs will want to see the results of the party's own polling, by Curia Market Research. Those results may be critical in determining Bridges' immediate future. They are delivered to the leadership team every Wednesday, but MPs are not shown the results every week. They are shown them once every sitting period. This week marks the start of this sitting period. If Bridges does not volunteer those results, MPs will – quite fairly – demand to see them" – see: Do polls chime tick-tock for Simon Bridges?.
National's chief pollster, David Farrar, suggests a "rogue poll" is likely: "the results are dramatically different. They are so far apart, that statistically it can't be margin of error… You basically can't reconcile these polls. One (or both) of them seem to be outside the 95% confidence interval, i.e. is the 1 in 20 'rogue' result" – see: A tale of two polls.
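To see why Farrar can call one result "rogue", it helps to look at the arithmetic behind the margin of error. The sketch below is a minimal illustration, assuming the standard formula for a proportion at 95% confidence and a sample size of around 1,000 respondents, which is typical for NZ political polls (the exact sample sizes of the two polls are an assumption here, not taken from the article):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of error, in percentage points, for a polled share.

    p: observed share as a fraction (e.g. 0.50 for 50%)
    n: sample size (assumed ~1,000 for a typical NZ poll)
    z: z-score for the confidence level (1.96 for ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n) * 100

# For a party polling near 50% with n = 1,000, the margin of error
# is roughly +/- 3.1 points.
print(f"+/- {margin_of_error(0.5, 1000):.1f} points")
```

With each poll accurate to roughly ±3 points, the two polls' estimates of the Labour–National gap (13 points apart in one, 2 the other way in the other) sit well outside what sampling error alone can explain, which is why at least one result looks like the 1-in-20 outlier.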
Farrar also points out the potentially important timing of the two polls: "The only other plausible explanation is that as the [1News] poll started a few days after [Newshub], Labour had a massive drop in support after those first few days".
The Spinoff's Alex Braae expands on this: "the Newshub poll started being conducted just before the Budget and teacher strike, and the One News poll started a few days after. There's a small possibility those events swayed significant numbers of voters away from the government. But it's a huge reach, and would also probably require both being at the extreme ends of their respective margins of error. Apart from that it's possible differences in methodology, sampling or weighting has played a role, but we can't say that for sure" – see: New polls bring joy, confusion for political obsessives.
Braae also makes the important point that the poll discrepancy "is a really nice example of how polls are effectively just a single sampled snapshot in time, and because of that no individual polls should be taken as gospel."
Nonetheless, given the huge discrepancy in the poll results, increased distrust about polling is likely to result. After all, in other parts of the world recently there have been some well-known examples of polling being in disrepute. As media-specialist Damien Venuto says today, "These questions are particularly pointed in the aftermath of the Colombian referendum, Brexit, the Trump victory and the recent Australian election – all notable examples of pollsters getting their predictions wrong" – see: Why polls get it so wrong so often (paywalled).
Venuto puts special emphasis on the traditional problem of "shy Tory syndrome" in which members of the public are disinclined to tell pollsters of their unfashionable voting intentions. On this, he quotes Colleen Ryan, of research firm TRA: "People are very poor at telling the truth… This doesn't mean that they're lying, but they do say things that they think others would like to hear".
He also raises the question of whether the public or the media are becoming too poll-driven: "Should we really be firing politicians on the basis that they haven't done well on a poll?... If politicians are consistently looking to appeal to the masses and win points in polls, there's a real risk that they'll lose the hearts of the key constituents they actually need to appeal to."
In this regard, it's also worth noting the advice of newspaper columnist Damien Grant, who recently argued that the polls are "Overanalysed and self-fulfilling" and "The whims of the electorate are as erratic and inexplicable as the sleeping patterns of a new-born. We shouldn't try to read too much into them" – see: Polls are as effective as chicken entrails to divining the will of the people.
Here's his main point: "We act as if they have meaning and so they have meaning. The commentariat is all a twitter over the latest jump in support for Labour and the corresponding slump for National and NZ First. It's noise. If a commentator knew what would cause a poll result, they could have predicted it. They didn't. The last election was a case study in the pointlessness of relying on polls, commentators or indeed elections as a guide to who will seize the levers of power. Our electoral process is, and has always been, a random-number generator with pundits trying to ascribe rational explanations to white noise."
According to Mike Williams, "It's difficult that landlines are dying out, it's hard to get hold of cellphone numbers, some of these people are now doing online polling". But he suggests that it might be time to go back to more traditional market research methods: "The only [solution] I can think of, and it's really expensive, is to actually go back to face-to-face polling, going back 30 to 40 years of the Heylen poll."
Of course, there are no real alternatives to using polling to gauge the public mood. There might now be some greater interest in finding out what Labour and National's own internal polling says, but reports of these should always be taken with more than a grain of salt. Henry Cooke reports: "The internal polls are more trusted, but don't seem to be agreeing either. Labour's is understood to have National below 40 and themselves in the high 40s, while National's is understood to have them neck and neck with the trend in their favour" – see: Duelling polls offer good news for both main parties, bad news for Simon Bridges.
Finally, it's always worth retaining some scepticism about how all these polls are reported. After all, sometimes the numbers involved are badly conveyed and poorly contextualised. And what about the crucial number of "don't knows" and "don't cares" that are hardly mentioned in the opinion poll stories? For more on all this, from the previous round of opinion poll debate back in February, see Colin Peacock's TV political poll hype hits new heights.