ECPR

Measuring Political Participation – A Question Wording Experiment

Mikael Persson
University of Gothenburg
Maria Solevid
University of Gothenburg
Open Panel

Abstract

Political participation is inconsistently measured in the major surveys used by political scientists. This is the first study to experimentally compare the outcomes of the rival measurement instruments applied in five major international surveys (CSES, WVS, ESS, CID and ISSP). Since both the question wordings and the response options offered differ between the major surveys, it is uncertain to what extent the item construction biases the results: do different item constructions estimate different levels of political participation? For voting, we know that self-reported levels in surveys are overestimated because some respondents are embarrassed to admit that they did not vote. However, it is uncertain whether this pattern also exists for other forms of political participation, since self-reported levels of political participation cannot be compared with any objective data. To address this problem, we use an embedded web-survey experiment to compare the political participation items used in the CSES, WVS, ESS, CID and ISSP. We manipulate both question wording and response options to gauge the effects of each component in a full factorial design. As for question wording, some of the international surveys include an introduction to the question expressing the social desirability of political participation, while others attempt to reduce over-reporting by eliminating social desirability pressure and instead include an introduction that normalizes inactivity. The major surveys also differ in the response options given. The time constraints on when the different acts of political participation were performed vary, ranging from ever to the past 12 months. In addition, some of the surveys use only a dichotomous response option (yes and no), while others use response options that also estimate potential participation.
Hence, we combine three different question wordings ((a) social desirability, (b) normalizing inactivity and (c) a control group without any introduction to the question) with three different response options (different combinations of time constraints and actual/potential participation response options), resulting in a full factorial design with nine treatments in total. Results show that the question wording indeed affects the responses given. We find significant differences in the levels of political participation reported by respondents who were subject to question introductions that normalize inactivity or express social desirability. However, the response options given also significantly affect the results. Presumably due to the sporadic nature of some forms of political participation, we find large differences in the proportion of individuals who stated that they had performed different acts when different time limits were applied. Also, providing a social desirability introduction significantly increases the number of individuals stating that they would potentially be politically active, compared to when an introduction that normalizes inactivity was given.