Monday 29 July – Friday 2 August and Monday 5 – Friday 9 August
09:00–10:30 and 11:00–12:30
This course approaches qualitative case studies from the perspective of method and practice. Our goal is to understand the advantages and challenges of the case study method and to detail the tasks involved in all stages of the research process.
The course has three interrelated components:
By the end of the course, you will be able to implement sound case studies and to critically evaluate published research.
Please note: the course is about case studies as a tool for generating causal inferences. It is not about case studies in the hermeneutic, interpretive and related traditions that eschew causal terminology, and it does not discuss actual data collection in much detail (e.g. issues related to the preparation of interviews or archival research).
'The Case Study Research class was a real eye-opener for me. By improving my research design and methods skills, it helped me immensely with my thesis. It also gave me the right tools to provide constructive feedback to colleagues. I cannot recommend this class enough, especially to young scholars just starting their PhD journey!' — Angelos Angelou, London School of Economics and Political Science
ECTS credits for this course and, below, the tasks required for additional credits:
6 credits: as above, plus pass a take-home exam, the details and deadline of which will be set during the course. I will give you three published case studies (journal articles); you choose one and answer five method-related questions (including several sub-questions) on the text.
8 credits: as above, plus submit a 10–15 page research proposal applying the lessons learned on this course to your own research project (detail your concept and justify your case selection, your cross-case comparison, etc.). If you don't have a research project, you will be given an equivalent task. The paper has to be sent to me within four weeks of the end of the course (exact deadline to be confirmed).
You must also complete the small assignments. This is less work than it sounds, because many of the assignments build on discussions and work that we do in class and refer to your own ongoing research project.
Ingo Rohlfing is Professor of Methods of Empirical Social Research at the University of Passau.
He researches social science methods with a focus on qualitative methods (case studies and process tracing), Qualitative Comparative Analysis and multimethod research.
Ingo is the author of Case Studies and Causal Inference (Palgrave Macmillan) and has published articles in Comparative Political Studies, Sociological Methods & Research and Political Analysis.
Case studies have a long tradition in the social sciences. They have been subject to a great deal of methodological appraisal and criticism.
This course focuses on the methodological and practical dimensions of case studies. It is useful for participants at every stage of their research: those at the beginning will gain the skills to plan their study carefully, and if you are in the midst of your analysis, you can evaluate your existing case study in the light of what you learn here and plan your next steps to meet the standards of good case study research.
Day 1
We introduce three dimensions that are central to all case study analyses:
Day 2
We discuss causation and causal inference, and when and how we claim that an observed empirical association reflects a causal relationship. We introduce the criterion of difference-making as the benchmark for inferring causal relationships. We elaborate on the distinction between causal effects and causal mechanisms and their role in causal inference and case studies.
A thorough treatment of causal inference requires a consideration of different notions of causal effects. A distinction is made between correlations (e.g., the more X, the more Y) and set-relations (e.g., if X, then Y), as these currently represent the two major perspectives on causal effects in the social sciences. Finally, we introduce the basics of Bayesianism, as it has become increasingly important in the recent literature on case studies and process tracing.
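As a rough notational sketch (my own shorthand, not taken from the course materials), the two perspectives can be written as follows, with X denoting the cause, Y the outcome, and i indexing cases:

\[
\text{Correlational effect:}\quad E[\,Y \mid X = x_{1}\,] \;>\; E[\,Y \mid X = x_{0}\,] \quad \text{for } x_{1} > x_{0}
\]
\[
\text{Set-relational effect (sufficiency):}\quad X \subseteq Y,\ \text{i.e. } X_{i} = 1 \Rightarrow Y_{i} = 1 \text{ for every case } i
\]
\[
\text{Set-relational effect (necessity):}\quad Y \subseteq X,\ \text{i.e. } Y_{i} = 1 \Rightarrow X_{i} = 1 \text{ for every case } i
\]

The first statement says that higher values of X go together with higher expected values of Y; the second and third say that membership in one set implies membership in the other, which is the 'if X, then Y' reading referred to above.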
As will be seen in the subsequent sessions, the means of implementing a case study may significantly depend on the causal effect that one deems to be in place. One major goal of this session is to encourage you to reflect on the specific type of causal relationship you believe to be in place and to formulate your theoretical expectations accordingly.
Day 3
We discuss the importance of delineating the population underlying your case study (provided you aim to generalise). You will see that the specification of the population requires three elements: the definition of scope conditions guiding your analysis, and the positive and negative conceptualisations of the phenomenon that you want to explain in your case study.
We start with the distinction between different types of scope conditions and their implications for empirical research. Then we clarify the distinction between the positive and the negative outcome and its role in drawing the boundaries of the population.
Days 4 and 5
These are concerned with two related issues. First, we discuss different types of cases, including the typical case, the deviant case, and the most-likely case, and their suitability for answering different research questions. Second, we consider case selection strategies for each type of case. A discussion of both will help you determine what type of study and case selection strategy you need to find answers to your own research question.
Since there is a menu of types and case selection strategies and both are linked to different modes of causal inference (see Day 2), we will spend two days on these topics.
Day 6
We begin with a session on the basics of comparative case studies, including a consideration of John Stuart Mill’s (in)famous method of difference and method of agreement. The session focuses on the benefits and limitations of cross-case comparisons in the light of the existing arguments for and against such designs. These criticisms can be subsumed under the rubric of the ‘small-n problem’: the claim that the number of cases in a case study is too small to generate valid causal arguments.
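As a stylised illustration (my own example, not one of the course readings), the method of difference compares cases that are alike on all measured background conditions (here Z1 and Z2) and differ only in the presence of the candidate cause X and the outcome Y:

\[
\begin{array}{lcccc}
 & Z_{1} & Z_{2} & X & Y \\
\text{Case A} & 1 & 1 & 1 & 1 \\
\text{Case B} & 1 & 1 & 0 & 0 \\
\end{array}
\]

Because the two cases differ only on X and Y, X is credited with making the difference to Y; the method of agreement reverses the logic by comparing cases that share the outcome and have only the candidate cause in common. The criticisms discussed in this session turn on whether all relevant conditions can really be held constant across so few cases.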
This session will help you understand the construction of proper comparisons and avoid making common mistakes in comparative case studies.
Day 7
We cover advanced issues in comparisons, detailing multiple strategies to mitigate the problems we learned about on Day 6. These strategies include the realisation of within-unit and longitudinal comparisons, an increase in the number of cases, binary measurement of causes (as opposed to multi-categorical measurement), and the transformation of causes into scope conditions.
Day 8
I explain the basics of within-case analysis and of process tracing in particular. We relate the idea of a causal mechanism to process tracing, discuss different variants of process tracing and within-case analysis, and consider how process tracing might help diminish some of the problems one confronts in comparative case studies.
Day 9
We continue with process tracing from a more practical perspective. I introduce different types of source (primary sources, interviews, etc.) with a focus on their respective advantages and disadvantages. You will become better aware of the problems involved in fact-gathering and the need to handle the collected information with caution.
Based on the discussion of sources and evidence and of the types of cases (see Days 4 and 5), we elaborate on how to use evidence systematically for causal inference. Specific attention is paid to the logic of Bayesian causal inference (see Day 2), and we introduce a step-by-step procedure for making sound causal inferences based on observations.
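For orientation only, here is the generic form of Bayesian updating on a piece of within-case evidence E for a hypothesis H (the notation is mine; the course's step-by-step procedure goes beyond this single formula):

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]

The posterior probability of the hypothesis is high to the extent that the evidence is expected if the hypothesis is true but unlikely under rival explanations, which is why the probative value of an observation depends on how well it discriminates between hypotheses.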
Day 10
We cover the generalisation of causal inferences. The opportunities and limits of generalisation are discussed together with techniques for extending insights beyond the cases under scrutiny. Case study researchers are often criticised when they engage in generalisation, so it is important to bring these problems to the fore.
Finally, a Q&A session lets you resolve any open questions that remain.
What this course is, and is not, about
The course is useful for participants who…
► Want to generate causal inferences for one or multiple cases. Whether or not you intend to generalise your inferences is of secondary importance because this is only one element of the research process (and this course).
► Want to learn the case study method and the careful construction of case study designs. The course has a strong practical element because you will learn tips and tricks for avoiding common mistakes in empirical research, i.e. the practice of case studies. But note: it is not about case studies in the hermeneutic or interpretive tradition, and it does not cover the practicalities of data collection in much detail.
All methodological discussions will be supplemented with example case studies from different subfields in political science. I expect you to complete the obligatory readings and contribute to in-class debate.
Before the course starts, I’ll send you a short questionnaire on your research project, prior methodological expertise, and expectations regarding the course.
Required reading in advance of this course
Yin, Robert K. (2008): Case Study Research: Design and Methods. Thousand Oaks: Sage Publications.
Lange, Matthew (2012): Comparative-Historical Methods. Los Angeles: SAGE.
Each course includes pre-course assignments (readings and pre-recorded videos) as well as daily live lectures totalling at least three hours. The instructor will conduct live Q&A sessions and offer designated office hours for one-to-one consultations.
Please check your course format before registering.
Live classes will be held daily for three hours on a video meeting platform, allowing you to interact with both the instructor and other participants in real time. To avoid online fatigue, the course employs a pedagogy that includes small-group work, short and focused tasks, and troubleshooting exercises that use a variety of online applications to facilitate collaboration and engagement with the course content.
In-person courses will consist of daily three-hour classroom sessions, featuring a range of interactive in-class activities including short lectures, peer feedback, group exercises, and presentations.
This course description may be subject to subsequent adaptations (e.g. to take into account new developments in the field, participant demands, group size, etc.). Registered participants will be informed of any changes.
By registering for this course, you confirm that you possess the knowledge required to follow it. The instructor will not teach these prerequisite items. If in doubt, please contact us before registering.
Day | Topic | Details |
---|---|---|
1 | Introduction to essentials of the case study method | Lecture (~90 min): Course goals; essentials and concepts central to the case study method (as discussed in this course); dimensions of case study research; levels of analysis, causal effects and causal mechanisms. Lab (~60 min): Dimensions in participants’ research |
2 | Causation and modes of causal inference | Lecture (~100 min): Associations and causal inference; causal effects: correlation vs. set relations; basics of Bayesianism. Lab (~60 min): What are causal effects and causal mechanisms in participants’ research? How can participants infer causation in their study? |
3 | Concepts and specification of population | Discussion of assignment from day 2 (~30 min). Lecture (~100 min): Scope conditions; positive concepts, negative concepts, and continua. Lab (~60 min): Identification of scope conditions and concepts in participants’ projects |
4 | Types of cases and case selection | Discussion of assignment from day 3 (~45 min). Lecture (~90 min): Characteristics of types of cases (typical, deviant, most-likely, etc.); selection rules for different types of cases. Lab (~45 min): Reflection on general case selection strategies |
5 | Types of cases and case selection | Discussion of assignment from day 4 (~45 min). Lecture (~90 min): Characteristics of types of cases (typical, deviant, most-likely, etc.); selection rules for different types of cases. Lab (~45 min): Identification of the appropriate type of case and selection rule in participants’ projects |
6 | Cross-case comparisons: basics | Lecture: Basics of comparative case studies; Mill’s method of agreement and method of difference; benefits and limitations of cross-case comparisons (the ‘small-n problem’) |
7 | Cross-case comparisons: advanced issues | Discussion of assignment from day 6 (~45 min). Lecture (~120 min): Units of analysis and time in comparisons; multi-case comparisons; binary and categorical measurement in comparisons; role of scope conditions |
8 | Within-case analysis: method & practice | Assignment/lab (~90 min, done in class): Identifying observations in empirical research. Lecture (~90 min): Process tracing & collecting observations; tying evidence to concepts and inferences; pros and cons of different types of sources |
9 | From observations to inferences | Discussion of assignment from day 8 (~45 min). Lecture (~140 min): Unique and contradictory inferences |
10 | Generalisation, summary and Q&A | Lecture (~90 min): Strategies of generalising causal inferences; summary of course. Q&A (~90 min) |
Day | Readings |
---|---|
1 | Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 1. Voluntary: Gerring, John (2004): What Is a Case Study and What Is It Good For? American Political Science Review 98 (2): 341-354. |
2 | Brady, Henry E. (2008): Causation and Explanation in Social Science. In Box-Steffensmeier, Janet M., Henry Brady and David Collier (eds): The Oxford Handbook of Political Methodology. Oxford: Oxford University Press: 217-226. Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: sections 2.3 and 2.4. Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 8 (will be online). Voluntary (causation and causal inference): Machamer, Peter, Lindley Darden, and Carl F. Craver (2000): Thinking about Mechanisms. Philosophy of Science 67 (1): 1-25. Machamer, Peter (2004): Activities and Causation: The Metaphysics and Epistemology of Mechanisms. International Studies in the Philosophy of Science 18 (1): 27-39. Gerring, John (2005): Causation: A Unified Framework for the Social Sciences. Journal of Theoretical Politics 17 (2): 163-198. Hedström, Peter and Petri Ylikoski (2010): Causal Mechanisms in the Social Sciences. Annual Review of Sociology 36 (1): 49-67. Lebow, Richard Ned (2010): Forbidden Fruit: Counterfactuals and International Relations. Princeton: Princeton University Press: chap. 2. Goertz, Gary and James Mahoney (2012): A Tale of Two Cultures: Contrasting Qualitative and Quantitative Paradigms. Princeton: Princeton University Press: chap. 2. Voluntary (Bayesianism): Bennett, Andrew (2010): Process Tracing and Causal Inference. In Brady, Henry E. and David Collier (eds): Rethinking Social Inquiry: Diverse Tools, Shared Standards. Lanham: Rowman & Littlefield: 207-219. Beach, Derek and Rasmus Brun Pedersen (2012): Process Tracing Methods. Ann Arbor: University of Michigan Press: chaps. 5, 6. Rohlfing, Ingo (forthcoming): Comparative Hypothesis Testing via Process Tracing. Sociological Methods & Research. Rohlfing, Ingo (2013): Bayesian Causal Inference in Process Tracing. SSRN working paper (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2301453). |
3 | Goertz, Gary (2006): Social Science Concepts: A User's Guide. Princeton: Princeton University Press: 27-53. Ragin, Charles C. (2000): Fuzzy-Set Social Science. Chicago: University of Chicago Press: chap. 2. Voluntary: Sartori, Giovanni (1970): Concept Misformation in Comparative Politics. American Political Science Review 64 (4): 1033-1053. Walker, Henry A. and Bernard P. Cohen (1985): Scope Statements: Imperatives for Evaluating Theory. American Sociological Review 50 (3): 288-301. Adcock, Robert and David Collier (2001): Measurement Validity: A Shared Standard for Qualitative and Quantitative Research. American Political Science Review 95 (3): 529-546. |
4 | Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 3 (will be online). Voluntary (types of cases): Lijphart, Arend (1971): Comparative Politics and the Comparative Method. American Political Science Review 65 (3): 682-693. Eckstein, Harry (1975): Case Study and Theory in Political Science. In Greenstein, F. I. and N. W. Polsby (eds): Strategies of Inquiry. Handbook of Political Science, vol. 7. Reading: Addison-Wesley: 92-123. Skocpol, Theda (2003): Doubly Engaged Social Science. In Mahoney, James and Dietrich Rueschemeyer (eds): Comparative Historical Analysis in the Social Sciences. Cambridge: Cambridge University Press: 407-428. Voluntary (case selection): King, Gary, Robert O. Keohane and Sidney Verba (1994): Designing Social Inquiry: Scientific Inference in Qualitative Research. Princeton: Princeton University Press: 128-148. Collier, David and James Mahoney (1996): Insights and Pitfalls: Selection Bias in Qualitative Research. World Politics 49 (1): 56-91. Seawright, Jason and John Gerring (2008): Case-Selection Techniques in Case Study Research: A Menu of Qualitative and Quantitative Options. Political Research Quarterly 61 (2): 294-308. |
5 | Same as day 4 |
6 | Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 4. Voluntary: Lijphart, Arend (1971): Comparative Politics and the Comparative Method. American Political Science Review 65 (3): 682-693. Lieberson, Stanley (1991): Small Ns and Big Conclusions: An Examination of the Reasoning in Comparative Studies Based on a Small Number of Cases. Social Forces 70 (2): 307-320. Savolainen, Jukka (1994): The Rationality of Drawing Big Conclusions Based on Small Samples: In Defense of Mill's Methods. Social Forces 72 (4): 1217-1224. Lieberson, Stanley (1994): More on the Uneasy Case for Using Mill-Type Methods in Small-N Comparative Studies. Social Forces 72 (4): 1225-1237. George, Alexander L. and Andrew Bennett (2005): Case Studies and Theory Development in the Social Sciences. Cambridge: MIT Press: chap. 8. Mahoney, James (1999): Nominal, Ordinal, and Narrative Appraisal in Macrocausal Analysis. American Journal of Sociology 104 (4): 1154-1196. Tarrow, Sidney (2010): The Strategy of Paired Comparison: Toward a Theory of Practice. Comparative Political Studies 43 (2): 230-259. Slater, Dan and Daniel Ziblatt (2013): The Enduring Indispensability of the Controlled Comparison. Comparative Political Studies 46 (10): 1301-1327. |
7 | Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 5. Voluntary: George, Alexander L. (1979): Case Studies and Theory Development: The Method of Structured, Focused Comparison. In Lauren, Paul Gordon (ed): Diplomacy: New Approaches in History, Theory, and Policy. New York: Free Press: 43-68. Mahoney, James (2000): Strategies of Causal Inference in Small-N Analysis. Sociological Methods & Research 28 (4): 387-424. George, Alexander L. and Andrew Bennett (2005): Case Studies and Theory Development in the Social Sciences. Cambridge: MIT Press: chap. 9. Gerring, John and Rose McDermott (2007): An Experimental Template for Case Study Research. American Journal of Political Science 51 (3): 688-701. Anckar, Carsten (2008): On the Applicability of the Most Similar Systems Design and the Most Different Systems Design in Comparative Research. International Journal of Social Research Methodology 11 (5): 380-401. |
8 | Bennett, Andrew and Jeffrey Checkel (2014): Process Tracing: From Methodological Roots to Best Practices. In Bennett, Andrew and Jeffrey Checkel (eds): Process Tracing in the Social Sciences: From Metaphor to Analytic Tool. Cambridge: Cambridge University Press: 1-37. Voluntary: Collier, David, Henry Brady and Jason Seawright (2004): Sources of Leverage in Causal Inference: Toward an Alternative View of Methodology. In Brady, Henry and David Collier (eds): Rethinking Social Inquiry: Diverse Tools, Shared Standards. Lanham: Rowman & Littlefield: 250-264. Checkel, Jeffrey T. (2008): Process Tracing. In Klotz, A. and D. Prakash (eds): Qualitative Methods in International Relations: A Pluralist Guide. Basingstoke: Palgrave Macmillan: 114-127. Gerring, John (2008): The Mechanismic Worldview: Thinking inside the Box. British Journal of Political Science 38: 161-179. Hall, Peter A. (2008): Systematic Process Analysis: When and How to Use It. European Political Science 7 (3): 304-317. Beach, Derek and Rasmus Brun Pedersen (2013): Process-Tracing Methods. Ann Arbor: University of Michigan Press: chap. 1. |
9 | Romney and Weller (1990): Metric Scaling: Correspondence Analysis. Sage Publications: 17-26, 55-70, 85-90 (compulsory). Blasius, Jörg and Victor Thiessen (2006): Assessing Data Quality and Construct Comparability in Cross-National Surveys. European Sociological Review 22 (3): 229-242 (recommended). |
10 | Rohlfing, Ingo (2012): Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan: chap. 9. Voluntary: Rueschemeyer, Dietrich (2003): Can One or a Few Cases Yield Theoretical Gains? In Mahoney, James and Dietrich Rueschemeyer (eds): Comparative Historical Analysis in the Social Sciences. Cambridge: Cambridge University Press: 305-332. Ruzzene, Attilia (2012): Drawing Lessons from Case Studies by Enhancing Comparability. Philosophy of the Social Sciences 42 (1): 99-120. Schatz, Edward and Elena Maltseva (2012): Assumed to be Universal: The Leap from Data to Knowledge in the American Political Science Review. Polity 44 (3): 446-472. |
In addition to the reading list above:
Lange, Matthew (2012): Comparative-Historical Methods. Los Angeles: SAGE.
Levy, Jack S. (2008): Case Studies: Types, Designs, and Logics of Inference. Conflict Management and Peace Science 25: 1-18.
Van Evera, Stephen (1997): Guide to Methods for Students of Political Science. Ithaca: Cornell University Press.
Yin, Robert K. (2008): Case Study Research: Design and Methods. Thousand Oaks: Sage Publications.
Research Designs
Introduction to QCA
Expert Interviews
Introduction to Qualitative Data Analysis with Atlas.ti
Introduction to NVivo for Qualitative Data Analysis
Advanced Multi-Method Research
Advanced Process-Tracing
Expert Interviews
Introduction to Qualitative Data Analysis with Atlas.ti
Introduction to NVivo for Qualitative Data Analysis
Advanced QCA