Asking specific, targeted questions can overcome misreporting and improve the accuracy of post-election surveys

Researchers depend on high-quality information from post-election survey data in order to understand the attitudes and social trends that shape election outcomes. But because of the stigma attached to not voting, some respondents misreport their behaviour, erroneously claiming to have voted. Here, Eva Zeglovits and Sylvia Kritzinger show that this problem can be overcome by modifying the question format.

Credit: Ariel Nadel, CC BY SA 2.0

Scholars studying electoral behaviour depend on high-quality survey data in the aftermath of an election. This is particularly the case when gathering information on electoral participation. As validated turnout information is not accessible in many countries, researchers have to rely on reported turnout stemming from surveys. Besides sampling and coverage errors, misreporting is one of the main reasons why reported turnout in surveys deviates from actual turnout results.

Misreporting encompasses both memory failures and social desirability bias: that is, people report behaviour they believe to be socially desirable even when it does not coincide with their actual behaviour. As voting is often regarded as socially desirable, some people hesitate to admit that they did not vote in an election and misreport their behaviour instead. The challenge is therefore to find a survey question format which reduces these sources of error in reporting turnout.

In our research note ‘New Attempts to Reduce Overreporting of Voter Turnout and Their Effects’, published recently in the International Journal of Public Opinion Research, we developed and tested different question formats to see whether, and to what extent, they can reduce the misreporting problem. We ran a survey experiment in a representative telephone survey in Austria in 2011, comparing the standard turnout question (a simple yes-no question) with two alternative question formats. First, we took up the approach developed by Belli and colleagues. Its most significant difference from the traditional yes-no turnout question in electoral surveys is that it diversifies the response options for reporting nonvoting, listing several face-saving ways to admit not having voted. We thus asked our respondents the following:

“In talking to people about elections, we often find that a lot of people were not able to vote because they were sick, did not have the time, or were just not interested. Which of the following statements best describes you?*

1. I did not vote in the federal election in Sept 2008*
2. I thought about voting this time but didn’t*
3. I usually vote but didn’t this time*
4. I am sure I voted in the federal election in Sept 2008*
5. I voted by absentee ballot**

*Read aloud
**Not read aloud, but volunteered

While this format proved to reduce memory failure and misreporting in the US, it failed to do so in Israel.

Second, we developed a new form of diversified response options, meant to be used when the election in question took place a long time ago: it should make it easier for respondents to say that they simply cannot remember whether they participated in that particular election.

‘In this election, a lot of people could not vote or chose not to vote for good reasons. This election was some time ago now. Which of the following statements describes you best?’ [All read aloud]

1. I am sure I did not vote in the federal election in September 2008.
2. I am not sure if I voted but I think it is more likely that I did not.
3. I am not sure if I voted but I think it is more likely that I did.
4. I am sure that I voted in the federal election in September 2008.
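
For analysis, a diversified item like this is typically collapsed into a binary turnout indicator, with the uncertain categories assigned to voting or nonvoting. The short Python sketch below shows one plausible coding; the variable names and the decision to count the ‘more likely’ categories together with the ‘sure’ ones are our illustrative assumptions, not the coding used in the paper.

```python
# Minimal sketch (not the authors' actual coding): collapsing the
# diversified response options above into a binary turnout indicator.
# Response codes follow the four options listed above:
#   1 = sure did not vote, 2 = more likely did not vote,
#   3 = more likely voted, 4 = sure voted
NONVOTER_CODES = {1, 2}
VOTER_CODES = {3, 4}

def code_turnout(response: int):
    """Return 1 (voted), 0 (did not vote) or None (refusal / don't know)."""
    if response in VOTER_CODES:
        return 1
    if response in NONVOTER_CODES:
        return 0
    return None

responses = [4, 1, 3, 2, 4, 4]                 # illustrative answers only
coded = [code_turnout(r) for r in responses]
valid = [t for t in coded if t is not None]
print(f"Reported turnout: {sum(valid) / len(valid):.1%}")
```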

We randomly assigned respondents to three treatment groups, each of which received a different question format on turnout. Against the benchmark of 78.8% official turnout in the 2008 Austrian federal election, the standard yes-no question format led to 85% reported turnout, while both alternative formats yielded approximately 81%. In contrast to the standard question, both alternative question formats thus produced lower self-reported turnout, which was also not significantly different from the official turnout.
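
A natural way to check a group’s reported turnout against the official benchmark is a one-sample proportion test. The sketch below is illustrative only: the group size and vote count are invented to match the percentages in the text, and this is not necessarily the exact test reported in the paper.

```python
# Minimal sketch: testing whether reported turnout in one treatment group
# differs from the official benchmark of 78.8%. The group size and vote
# count are invented for illustration; they are not the AUTNES figures.
from statsmodels.stats.proportion import proportions_ztest

OFFICIAL_TURNOUT = 0.788
n = 400                  # assumed group size (hypothetical)
reported_voters = 324    # ~81% reported turnout, as in an alternative-format group

stat, p_value = proportions_ztest(count=reported_voters, nobs=n,
                                  value=OFFICIAL_TURNOUT)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with reported turnout not differing
# significantly from the official result.
```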

Most interesting, though, is that we observed a spillover effect on the follow-up question about electoral behaviour in the upcoming election. Respondents who were asked the standard question not only reported higher levels of past turnout but also higher levels of future turnout: only 8% of them declared that they would not turn out to vote in the upcoming elections.

In contrast, this proportion was significantly higher (14% and 16%) among respondents who had been asked the alternative questions beforehand. Apparently, once a question makes it easier to report non-voting in the last election, the effect persists: respondents become more likely to declare that they will not turn out in an upcoming election either. Hence, the experimental treatment not only affected the question within the survey experiment but also had consequences for later survey questions.
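
The spillover comparison can be illustrated with a two-sample proportion test on the share of respondents declaring they will not vote. The counts below are invented to match the reported percentages; they are not the actual cell sizes from the study.

```python
# Minimal sketch: comparing declared future non-voting between the standard
# group and one alternative-format group. Group sizes are invented; only the
# proportions (8% vs. 14%) come from the text.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

nonvoters = np.array([32, 56])      # e.g. 8% of 400 vs. 14% of 400
group_sizes = np.array([400, 400])

stat, p_value = proportions_ztest(count=nonvoters, nobs=group_sizes)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
# A small p-value would indicate that the earlier question format shifted
# answers to this later intention question.
```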

We conclude that researchers planning surveys should choose their turnout question format carefully, as it will not only affect responses to the turnout question itself but might also have spillover effects on subsequent questions. Both question formats we tested in our experiment seem promising for reducing over-reporting of past turnout and providing better forecasts of future turnout. For the AUTNES surveys conducted immediately after the federal election in September 2013, we implemented these findings and used the question format adapted from the Belli et al. approach: we diversified the response options for reporting nonvoting.

Click here to download the AUTNES survey data

Note: this article represents the views of the authors and not those of Democratic Audit or the LSE. Please read our comments policy before posting. The shortened URL for this post is: https://buff.ly/VO0Bac

Eva Zeglovits is an electoral researcher affiliated with the University of Vienna and the Austrian National Election Study (AUTNES). She is one of the managers of IFES, a private Austrian company specialising in designing and conducting high-quality surveys. Follow Eva on Twitter @EvaZeglovits

Sylvia Kritzinger is Professor at the Department of Methods in the Social Sciences, University of Vienna, and co-principal investigator of the Austrian National Election Study (AUTNES), responsible for the voter studies.

 
