The Guardian - AU
Nick Evershed and Josh Nicholas

How much can polling really tell us about support for the voice to parliament?

Polling consistently shows a divide along party voting lines, with Labor and Greens supporters more likely to say they’ll support the voice, while Coalition voters are more likely to be opposed. Illustration: Guardian Design

A Guardian Australia analysis of polling shows support has declined for the Indigenous voice to parliament, but it’s impossible to tell by exactly how much due to differences in how surveys are carried out.

The level of support in some states is also difficult to measure accurately and will remain so without an increased focus from pollsters on states that are underrepresented in polling.

Guardian Australia’s new voice poll tracker, an analysis that aggregates polls to determine the trend in support at the national level, shows overall support for the voice to parliament has declined since August 2022.
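The tracker’s actual methodology is described on its methods page; purely as an illustration of the aggregation idea, the sketch below takes hypothetical poll results and computes a sample-size-weighted average of yes support over a trailing window.

```python
from datetime import date

# Hypothetical poll results: (fieldwork end date, pollster, yes %, sample size).
# These figures are illustrative only, not the tracker's actual inputs.
polls = [
    (date(2023, 4, 2), "Pollster A", 54.0, 1500),
    (date(2023, 4, 20), "Pollster B", 52.0, 1100),
    (date(2023, 5, 8), "Pollster C", 49.0, 2000),
    (date(2023, 5, 30), "Pollster A", 47.0, 1500),
    (date(2023, 6, 12), "Pollster B", 46.0, 1200),
]

def rolling_average(polls, as_of, window_days=60):
    """Sample-size-weighted mean of yes support for polls ending
    within window_days of as_of."""
    recent = [(yes, n) for end, _, yes, n in polls
              if 0 <= (as_of - end).days <= window_days]
    if not recent:
        return None
    total_n = sum(n for _, n in recent)
    return sum(yes * n for yes, n in recent) / total_n

print(rolling_average(polls, date(2023, 6, 15)))  # weighted yes % over the window
```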

Polling experts and political scientists say measuring the specific level of support is tricky due to a number of issues, including polling companies asking different questions, polling companies allowing different types of responses, opaque methods, and low polling numbers in states such as Tasmania, South Australia and Western Australia.

These issues create uncertainty around the polling figures, says Simon Jackman, a political science professor at the University of Sydney.

“Poll averaging is difficult here, as unlike vote intention for a federal election – which has a bog standard question wording that everybody uses – here, even though we’ve got a very simple proposition, it varies in the way that pollsters have been asking it or giving people an undecided option,” he said.

“I think the one thing I look at is pretty clear, I think that [the yes vote] has shed some support and [the no vote] has gained some support since the start of the year. But again, because of those variations in question wording and whatnot, I would hesitate to put a precise number on it.”

In a detailed analysis of question type and response choice in polls on the voice, Prof Murray Goot from Macquarie University identified the difficulty of measuring support across multiple polls.

“More important than sampling error is non-sampling error – the error that flows from sources other than the variance inherent in sampling – including poorly worded questions, the effects of asking questions in a particular order, and inappropriate response options,” he wrote.

Even within a single polling company, survey questions and methods have changed over time. A recent Newspoll survey, for example, changed both the wording of the question asked and the choices that respondents were offered.

The change in approach coincided with an increase in the number of people responding “don’t know” – from 8% in April to 11% in June – and a drop in the percentage of people intending to vote yes – from 53% down to 46%.

Question order can also produce different responses. The lowest level of support for the voice so far was in a poll commissioned by the Institute of Public Affairs (IPA) in December 2022, which put the percentage of respondents voting yes at only 28%.

However, Goot identified some key differences between the question asked in the IPA poll and others.

“The IPA (opposed to a voice) asked about ‘a separate entity of federal parliament, solely comprised of Indigenous Australians, to advise on laws for every Australian’, without mentioning a ‘voice’,” Goot wrote.

The question itself was preceded by another question asking whether it was “racist for someone to oppose the Indigenous-only voice to parliament”.

An earlier poll by the same polling company, which showed a yes vote of 65%, asked about a “representative First Nations body to advise the parliament on laws and policies affecting First Nations people”, and referred to a First Nations voice.

The headline level of support for the voice will also vary depending on whether a pollster allows respondents to choose between yes, no and don’t know, or forces them to pick only between yes and no.
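The arithmetic behind that difference is simple. In the hypothetical example below, the same set of responses yields a noticeably different headline yes figure depending on whether undecided respondents are counted in the base or excluded, as a forced-choice question effectively does.

```python
# Illustrative only: the same underlying responses reported two ways.
yes, no, undecided = 430, 380, 190  # hypothetical raw response counts

total = yes + no + undecided
decided = yes + no

yes_with_undecided = 100 * yes / total   # 43% when "don't know" is a reported option
yes_forced_choice = 100 * yes / decided  # ~53% when undecideds are excluded or pushed to choose

print(f"{yes_with_undecided:.0f}% vs {yes_forced_choice:.0f}%")
```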

A majority of majorities: the state issue

One of the unique features of a constitutional referendum is the requirement for a double majority. For the referendum to succeed, a majority of voters nationally need to vote yes. It also requires a majority in a majority of states, so four out of the six states must have a majority yes vote.
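As an illustration of how the rule works, the short sketch below (using made-up figures) checks a result against the double majority: a narrow national majority can still fail if fewer than four states record majority yes votes.

```python
# A minimal sketch of the double-majority rule, with hypothetical state results.
# A referendum passes only if the national vote is a majority AND
# a majority of the six states (at least four) vote yes.

def double_majority(national_yes_pct, state_yes_pcts):
    national_majority = national_yes_pct > 50
    states_carried = sum(1 for pct in state_yes_pcts.values() if pct > 50)
    return national_majority and states_carried >= 4

# Hypothetical figures: a national majority can still fail on the state count.
states = {"NSW": 52, "VIC": 55, "QLD": 46, "WA": 47, "SA": 49, "TAS": 51}
print(double_majority(51.5, states))  # False: only 3 of 6 states above 50%
```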

This presents another difficulty in tracking the likelihood of success of the voice. Few national polls so far have surveyed enough people to get a decently sized sample of voters from all of the six states, with South Australia, Western Australia and Tasmania in particular having low numbers.

Having a lower sample size from a state means pollsters will either not report the result at all, or the result will have a larger margin of error, meaning the precise level of support or opposition is harder to determine.
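The link between sample size and margin of error can be sketched with the standard approximation for a simple random sample; real polls use weighting and design effects, so their actual uncertainty is usually larger than these illustrative figures suggest.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error (in percentage points) for a simple
    random sample of size n, at observed proportion p."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

# Hypothetical state sub-sample sizes from a single national poll of ~1,500 people.
for state, n in [("NSW", 480), ("VIC", 380), ("SA", 105), ("TAS", 35)]:
    print(f"{state}: n={n}, margin of error ±{margin_of_error(n):.1f} points")
```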

This is particularly clear in our chart tracking state support from the Essential poll, which shows the margin of error for each state – information that polling companies rarely publish.

Despite these issues, experts are hopeful that polling companies will rise to the occasion and increase polling in the smaller states.

“If it is business as usual in terms of polling, the outcome of the referendum will be hard to predict, because we’re all going to want to know what’s happening state by state,” Jackman said.

“Majorities in four states are a necessary condition for this thing to pass or fail. And so there are seven horse races going on, one national and six states.

“And I think some of those horse races aren’t going to be as close as others, and it’s the close ones that we want to keep an eye on. And that’s a bit of a challenge for the way polling was traditionally done by and for media organisations.”

As the referendum approaches, Jackman said interest in the outcome will increase, and so pollsters, and their media clients, may change their approaches.

“I would argue that this is a time for polling resources … to really think about a strategy whereby you’re extracting more sample than you ordinarily would out of smaller states like Tasmania and South Australia,” he said.

Social desirability bias?

Another issue in measuring support for the voice is that people may be reluctant to express opinions they see as controversial.

William Bowe, an election analyst and publisher of poll-tracking website The Poll Bludger, said there were other issues beyond survey methods and state numbers.

“I think the bigger imponderable is how accurate the polling is going to be, full stop – could there be social desirability bias going on here?” he said.

Bowe said polling for the same-sex marriage survey overestimated the margin of support, and that this may in part be because people are reluctant to give politically incorrect answers to these sorts of questions.

Polling consistently shows a divide along party voting lines, with Labor and Greens supporters more likely to say they will support the voice, while Coalition voters are more likely to be opposed.

A similar divide emerges across age groups, with younger people far more likely to say they will vote yes and support dropping as age increases.

  • Guardian Australia will be regularly updating the voice poll tracking page throughout the referendum campaign. You can find out more about the methods and data sources we’re using here.

Here are some other projects or websites which are useful for following the polling:
