Monday 29 July – Friday 2 August
09:00–10:30 / 11:00–12:30
This one-week course provides a thorough introduction to the most widely used methods of causal inference (also known as program evaluation). This is an applied methods course, with equal emphasis on methods and their applications.
The target audience is practitioners and would-be practitioners of policy evaluation who want to measure the effects of policy interventions and policy changes. We especially recommend the course to researchers in social sciences (PhD students and young scholars) interested in measuring the effect of interventions and policy changes.
ECTS credits for this course are listed below, together with the tasks required for the additional credits:

- 1 additional credit: pass a quiz in advance each day.
- 2 additional credits: as above, plus complete a take-home paper.
Dániel Horn is a research fellow at the Centre for Economic and Regional Studies of the Hungarian Academy of Sciences, and an associate professor at the Department of Economics, Eötvös Loránd University in Budapest.
Besides economics courses, he has taught statistics, introduction to Stata, and various public policy design and evaluation courses for over five years at PhD, MA, and bachelor's levels.
He has been conducting educational impact assessment for over a decade. His research areas include education economics, social stratification and educational measurement issues.
The knowledge and skills developed in this course are essential to understand and effectively participate in policy evaluations. They are necessary for anyone who works with the results of policy evaluations: those who commission evaluations and want to judge feasibility and methodological soundness of competing proposals, and those who use the results of evaluations and want to know the validity and reliability of their results.
The ideal student has some statistical background and is not afraid of working with data but does not have a firm methodological background in program evaluation or econometrics.
The course provides up-to-date methodological knowledge, together with the corresponding intuition and related skills in handling software, interpreting results and presenting those results.
By the end of this course, you should be able to:
During the course, I will:
Methods covered
Day 1 – Introduction to impact evaluation
Why do we need to evaluate? Identifying the counterfactual. Getting the question right and formulating hypotheses. Causal inference versus association. The potential outcomes framework. An example: the Job Corps program in the US. Heterogeneous treatment effects. Average treatment effects.
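The potential outcomes framework behind these concepts can be illustrated with a small simulation. The sketch below is in Python for brevity (the course itself uses Stata or R), and all numbers are invented for illustration: each unit has two potential outcomes, the individual effects are heterogeneous, and the average treatment effect is their mean, even though only one potential outcome per unit is ever observed.

```python
import random

random.seed(0)
n = 10_000

# Potential outcomes: Y0 without treatment, Y1 with treatment.
# Individual effects Y1 - Y0 vary across units (heterogeneous effects).
y0 = [random.gauss(10, 2) for _ in range(n)]
effect = [random.gauss(1.5, 0.5) for _ in range(n)]
y1 = [a + e for a, e in zip(y0, effect)]

# Average Treatment Effect: the mean of the individual effects.
ate = sum(effect) / n

# The fundamental problem of causal inference: for each unit we
# observe only ONE of the two potential outcomes, never both.
treated = [random.random() < 0.5 for _ in range(n)]
observed = [b if t else a for a, b, t in zip(y0, y1, treated)]
```

Because assignment here is random, the simple difference between treated and control means of `observed` would estimate the ATE, which is the subject of Day 2.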
Day 2 – Randomised Control Trials (RCT)
Intro to RCTs. Why does randomisation help? Randomisation in practice. Examples (the NSW program in the US; PROGRESA in Mexico). Intent-to-treat effects. Imperfect compliance. Local average treatment effects.
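The distinction between intent-to-treat and local average treatment effects can be sketched with simulated data. This is an illustrative Python sketch, not course material (the course uses Stata or R): assignment is randomised but only some units comply, so the ITT is diluted, and the Wald ratio recovers the effect on compliers.

```python
import random

random.seed(1)
n = 50_000

# Z: randomised assignment; compliance is imperfect. Only "compliers"
# (about 60% of units) take the treatment when assigned, and nobody
# is treated without assignment (one-sided non-compliance).
z = [random.random() < 0.5 for _ in range(n)]
complier = [random.random() < 0.6 for _ in range(n)]
d = [zi and ci for zi, ci in zip(z, complier)]

# The treatment raises the outcome by 2 for those actually treated.
y = [random.gauss(0, 1) + (2.0 if di else 0.0) for di in d]

def mean(xs):
    return sum(xs) / len(xs)

# Intent-to-treat: the effect of *assignment*, diluted by non-compliance.
itt = (mean([yi for yi, zi in zip(y, z) if zi])
       - mean([yi for yi, zi in zip(y, z) if not zi]))

# Wald estimator: ITT scaled by the difference in treatment take-up
# between assignment arms -> the LATE, the effect on compliers.
share_d1 = mean([float(di) for di, zi in zip(d, z) if zi])
share_d0 = mean([float(di) for di, zi in zip(d, z) if not zi])
late = itt / (share_d1 - share_d0)
```

With roughly 60% compliance, the ITT comes out near 1.2 while the LATE recovers the per-treated effect of 2, which is why the two estimands answer different policy questions.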
Day 3 – RCT & Matching
RCT regressions. Intro to matching: unconfoundedness. Common support. Exact matching.
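Exact matching under unconfoundedness can be illustrated in a few lines. A hedged Python sketch with simulated data (the course uses Stata or R): treatment take-up depends on a discrete covariate X, so the naive treated–control comparison is confounded, while comparing within cells of X, on the common support, recovers the effect.

```python
import random
from collections import defaultdict

random.seed(2)

# A discrete covariate X drives both take-up and the outcome,
# so the naive treated-control comparison is biased.
data = []
for _ in range(20_000):
    x = random.choice([0, 1, 2])
    t = random.random() < 0.2 + 0.3 * x      # selection on X
    y = 3.0 * x + (1.0 if t else 0.0) + random.gauss(0, 1)
    data.append((x, t, y))

# Pool control outcomes by exact value of X.
control_y = defaultdict(list)
for x, t, y in data:
    if not t:
        control_y[x].append(y)

# ATT by exact matching: each treated unit is compared with the mean
# of controls sharing its X; cells without common support are dropped.
diffs = [y - sum(control_y[x]) / len(control_y[x])
         for x, t, y in data if t and control_y[x]]
att = sum(diffs) / len(diffs)
```

With many or continuous covariates, exact cells become empty and common support fails, which motivates the propensity-score methods covered on Day 4.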
Day 4 – Linear Regressions & Matching
Simple linear regression. Ensuring common support. Functional form choices. Regression versus matching. Matching on the propensity score. Example in matching: teacher training in Israel.
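Matching on the propensity score can be sketched as follows. This illustrative Python example (the course uses Stata or R) matches on the true propensity score for brevity; in practice the score would first be estimated, typically with a logistic regression of treatment on the covariates.

```python
import math
import random

random.seed(3)

def pscore(x):
    # True propensity score P(T=1 | X); in real data this would be
    # estimated rather than known.
    return 1.0 / (1.0 + math.exp(-x))

units = []
for _ in range(4_000):
    x = random.gauss(0, 1)
    t = random.random() < pscore(x)            # selection on X
    y = 2.0 * x + (1.0 if t else 0.0) + random.gauss(0, 0.5)
    units.append((pscore(x), t, y))

treated = [(p, y) for p, t, y in units if t]
control = [(p, y) for p, t, y in units if not t]

# ATT by 1-nearest-neighbour matching on the propensity score:
# each treated unit is paired with the control closest in p.
diffs = []
for p, y in treated:
    _, y_match = min(control, key=lambda c: abs(c[0] - p))
    diffs.append(y - y_match)
att = sum(diffs) / len(diffs)
```

Comparing this estimate with the raw treated-minus-control difference in means makes the selection bias that matching removes visible, and pairs naturally with the regression-versus-matching discussion above.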
Day 5 – Matching
Examples in matching (the New Deal for Lone Parents programme in the UK; a debate based on the NSW program in the US). Student presentations.
Basic knowledge of statistics and regression analysis at an undergraduate level, and some knowledge of Stata or R.
Each course includes pre-course assignments, including readings and pre-recorded videos, as well as daily live lectures totalling at least two hours. The instructor will conduct live Q&A sessions and offer designated office hours for one-to-one consultations.
Please check your course format before registering.
Live classes will be held daily for two hours on a video meeting platform, allowing you to interact with both the instructor and other participants in real-time. To avoid online fatigue, the course employs a pedagogy that includes small-group work, short and focused tasks, as well as troubleshooting exercises that utilise a variety of online applications to facilitate collaboration and engagement with the course content.
In-person courses will consist of daily three-hour classroom sessions, featuring a range of interactive in-class activities including short lectures, peer feedback, group exercises, and presentations.
This course description may be subject to subsequent adaptations (e.g. taking into account new developments in the field, participant demands, group size, etc.). Registered participants will be informed at the time of change.
By registering for this course, you confirm that you possess the knowledge required to follow it. The instructor will not teach these prerequisite items. If in doubt, please contact us before registering.
| Day | Topic | Details |
|---|---|---|
| 1 | Introduction to impact evaluation | Lectures |
| 2 | Randomised Control Trials (RCT) | Lectures |
| 3 | RCT & Matching | Lectures. Take-home for day 5: replicating LaLonde (1986). |
| 4 | Linear Regressions & Matching | Lectures. Take-home for day 5: using LaLonde (1986) to analyse the effect of the NSW program. |
| 5 | Matching | Lectures; student presentations |
| Day | Readings |
|---|---|
| 1 | Impact Evaluation in Practice (World Bank, 2011), Ch. 1–4 |
| 2 | Imbens and Wooldridge (2009) |
| 3 | LaLonde (1986); Impact Evaluation in Practice (World Bank, 2011), Ch. 7 |
| 4 | Impact Evaluation in Practice (World Bank, 2011), Ch. 7 |
| 5 | Dehejia and Wahba (1999); Smith and Todd (2005) |
Stata or R.
Please bring your own laptop with the software installed.
Compulsory texts
Impact Evaluation in Practice (World Bank, 2011)
Imbens, Guido W., and Jeffrey M. Wooldridge. 2009. "Recent Developments in the Econometrics of Program Evaluation." Journal of Economic Literature 47 (1): 5–86.
LaLonde, Robert. 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data." American Economic Review 76 (September): 604–620.
Dehejia, Rajeev H., and Sadek Wahba. 1999. "Causal Effects in Nonexperimental Studies: Reevaluating the Evaluation of Training Programs." Journal of the American Statistical Association 94 (448): 1053–1062.
Smith, Jeffrey, and Petra Todd. 2005. "Does Matching Overcome LaLonde's Critique of Nonexperimental Estimators?" Journal of Econometrics 125 (1–2).
Optional texts
Deaton, Angus. 2009. "Instruments of Development: Randomization in the Tropics, and the Search for the Elusive Keys to Economic Development." Princeton mimeo, January 2009. We will discuss section 4.
Diamond, Alexis, and Jasjeet S. Sekhon. 2012. "Genetic Matching for Estimating Causal Effects: A General Multivariate Matching Method for Achieving Balance in Observational Studies." Review of Economics and Statistics.
Duflo, Esther, Rachel Glennerster, and Michael Kremer. 2007. "Using Randomization in Development Economics Research: A Toolkit." CEPR Discussion Paper 6059. Sections 2.1 and 2.2.
Imbens, Guido W. 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)." Journal of Economic Literature 48 (2): 399–423.
Iacus, Stefano M., Gary King, and Giuseppe Porro. 2011. "Causal Inference without Balance Checking: Coarsened Exact Matching." Political Analysis.
Imbens, Guido W. 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation." American Economic Review 93 (2): 126–132.
Angrist, Joshua, and Victor Lavy. 2001. "Does Teacher Training Affect Pupil Learning? Evidence from Matched Comparisons in Jerusalem Public Schools." Journal of Labor Economics 19 (2).
Morgan, Stephen L., and Christopher Winship. Counterfactuals and Causal Inference: Methods and Principles for Social Research.
Dolton, Peter, and Jeffrey A. Smith. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt." IZA Discussion Paper 5491.
Related Summer School courses: Introduction to Stata; Multiple Regression Analysis: Estimation, Diagnostics, and Modelling; Causal Inference in the Social Sciences II; Introduction to Experimental Research in the Social Sciences.