Web-based probability panels with open calls for questions (such as the GESIS Panel in Germany, the LISS panel in the Netherlands, and the UAS and TESS panels in the US), as well as online microwork platforms (such as Amazon's Mechanical Turk and Clickworker.de), are increasingly popular tools for social scientists with limited resources to field survey questions. Probability panel organizations provide representative samples and often offer questionnaire design and testing services, whereas microwork platforms offer inexpensive access to professional, and thus presumably high-quality, respondents. While survey methodologists are familiar with the methodological merits and challenges of online probability panels and microwork platforms in comparison to traditional forms of survey research (Blom, Gathmann, & Krieger 2015; Bosnjak et al. 2017; Callegaro & DiSogra 2008; Shank 2015; Lutz 2016; Majima 2017), the merits of these two types of platforms relative to each other have not yet been explored, in particular with regard to potentially sensitive questions that might suffer from social desirability bias.
On the one hand, panel surveys are thought to reduce social desirability bias, in part because respondents grow comfortable with the survey administration over repeated waves. Yet validity tests, whether conducted longitudinally or between panel participants and cross-sectional respondents, may be confounded by question-level effects (Crutzen & Göritz 2010; Uhrig 2011; Halpern-Manners et al. 2017; Toh et al. 2006; Binswanger et al. 2013). On the other hand, in samples recruited from microwork platforms, the separation of platform administration from survey administration enhances anonymity and privacy. Anonymous surveys have been shown to increase the reporting of sensitive behaviors and attitudes (O'Malley et al. 2000; Durant, Carey, & Schroder 2002; van de Looij-Jansen, Goldschmeding, & de Wilde 2006; Brown & Vanable 2009). To date, however, these studies have focused on school children and young adults and have not addressed political attitudes. The extent to which the reduction in social desirability bias on sensitive political questions in online panel surveys is comparable to that in online microwork platforms therefore remains poorly understood.
To examine the effect of panel participation versus anonymity on responses to sensitive political questions, we implemented a vignette experiment on attitudes toward refugees, specifically support for family reunification, in a single wave of the German Internet Panel (GIP). To test for social desirability bias within the panel setting, we fielded the same vignette experiment with an anonymous sample recruited through the microwork platform Clickworker.de. We collected the same data from both samples and used matching methods (e.g., Ho et al. 2007) to achieve balance between respondents on relevant observed covariates. The results of our study provide important insights into the level of bias in the inferences researchers draw from panel studies.
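To illustrate the matching step, the following is a minimal sketch of 1:1 nearest-neighbor propensity-score matching in the spirit of Ho et al. (2007). The covariates, sample sizes, and toy data here are purely illustrative assumptions, not the study's actual variables; the study's own matching specification may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Hypothetical covariates (age in years, education in years) for a
# panel sample (treated = 1) and a microwork sample (treated = 0).
n_panel, n_click = 100, 300
X_panel = np.column_stack([rng.normal(50, 12, n_panel),
                           rng.normal(13, 3, n_panel)])
X_click = np.column_stack([rng.normal(35, 10, n_click),
                           rng.normal(14, 3, n_click)])
X = np.vstack([X_panel, X_click])
treat = np.concatenate([np.ones(n_panel), np.zeros(n_click)])

# Step 1: estimate propensity scores with logistic regression.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# Step 2: for each panel respondent, find the nearest microwork
# respondent on the propensity score (matching with replacement,
# for brevity).
nn = NearestNeighbors(n_neighbors=1).fit(ps[treat == 0].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treat == 1].reshape(-1, 1))
matched_controls = X[treat == 0][idx.ravel()]

# Step 3: check covariate balance via standardized mean differences,
# which should shrink after matching.
def smd(a, b):
    return (a.mean(0) - b.mean(0)) / np.sqrt((a.var(0) + b.var(0)) / 2)

print("SMD before matching:", smd(X_panel, X_click))
print("SMD after matching: ", smd(X_panel, matched_controls))
```

In practice, balance diagnostics such as standardized mean differences are inspected for every covariate before outcomes are compared across the matched samples.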