Member rate £492.50
Non-Member rate £985.00
Save £45: loyalty discount applied automatically*
Save 5% on each additional course booked
*If you attended our Methods School in the last calendar year, you qualify for £45 off your course fee.
Friday 2 March
13:00–15:00 and 15:30–17:00
Saturday 3 March
09:00–10:30 / 11:00–12:00 and 13:00–14:30
This course introduces maximum likelihood estimation (MLE), one of the most important methods of parameter estimation in the modern social sciences, on which a large body of published empirical work relies.
You will learn the basic idea of MLE and some of its extensions through several example cases.
In contrast to least squares methods, MLE requires you to make assumptions about the distributions of stochastic terms. This course therefore also covers the basics of probability theory.
On the final afternoon you will write programs in R to estimate some model parameters.
Susumu Shikano is Professor of Political Methodology at the University of Konstanz. His research interests are spatial models of politics and various topics in political behaviour.
His work has appeared in journals including Public Choice, Political Psychology, Party Politics, West European Politics, and the British Journal of Political Science.
In modern social sciences, statistical analysis has long been established as an inference tool.
Although most published work relies on MLE for its statistical analysis, many researchers seem to know little about MLE itself, relying instead on statistical software packages. Some interpret MLE results as if they were based on the least squares technique. Others cannot distinguish likelihood from probability.
This course, therefore, aims to deepen your understanding of how statistical packages use MLE. This will help you better understand inference, because MLE is closely related to the likelihood model of inference (King 1998).
We begin by discussing some important concepts:
Most importantly, we clarify the concepts of likelihood and probability and their distinct roles in inference. We also look at the foundations of probability theory, which many students learn only implicitly in conventional social science statistics lectures. These foundations will help you understand the likelihood-based model of inference.
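To make the distinction concrete, here is a small illustrative R snippet (not course material; the data, 7 successes in 10 trials, are made up). It evaluates the same binomial density first as a probability, holding the parameter fixed and varying the data, and then as a likelihood, holding the observed data fixed and varying the parameter.

```r
y <- 7; n <- 10   # hypothetical data: 7 successes in 10 trials

# Probability: fix the parameter, vary the data.
# P(Y = k | p = 0.5) for every possible outcome k
dbinom(0:n, size = n, prob = 0.5)

# Likelihood: fix the observed data, vary the parameter.
# L(p | y = 7) evaluated on a grid of candidate values of p
p_grid <- seq(0.01, 0.99, by = 0.01)
lik    <- dbinom(y, size = n, prob = p_grid)

# The likelihood is maximised at the sample proportion y/n = 0.7
p_grid[which.max(lik)]
```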
Using a linear regression example, you will also learn how MLEs are computed (maximization algorithms) and about their properties in inferential statistics, in particular their asymptotic properties.
After gaining a conceptual overview of MLE, you will learn how to program in R to obtain MLEs. R packages are available, so you will not have to program the maximization algorithms yourself, but you will learn how to define the likelihood function.
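As a rough sketch of what this can look like in practice, the snippet below defines a normal log-likelihood for a linear regression on simulated data and hands it to R's general-purpose optimiser optim(); the simulated data and variable names are illustrative assumptions, not the course's own code.

```r
set.seed(1)
n <- 200
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n, sd = 1.5)   # simulated data with known parameters

# Negative log-likelihood of the linear regression model
# (optim() minimises by default, so we return the negative)
negll <- function(par) {
  beta0 <- par[1]
  beta1 <- par[2]
  sigma <- exp(par[3])                # exp() keeps the error sd positive
  -sum(dnorm(y, mean = beta0 + beta1 * x, sd = sigma, log = TRUE))
}

fit <- optim(c(0, 0, 0), negll, method = "BFGS", hessian = TRUE)

fit$par[1:2]                    # ML estimates of intercept and slope
coef(lm(y ~ x))                 # least squares gives the same point estimates here
sqrt(diag(solve(fit$hessian)))  # asymptotic standard errors from the Hessian
                                # (the third entry is on the log-sigma scale)
```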
The course starts with a linear regression model and extends the class of statistical models to discrete regression models, such as binary logit/probit and Poisson models.
We cannot cover a wide range of statistical models in this short course, but after completing it, you should be able to apply the basic concepts of MLE and your programming skills to various statistical models in different practical situations.
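For a flavour of how the same recipe extends to a discrete model, here is a hedged sketch of a Poisson regression fitted by maximising the log-likelihood directly and then checked against glm(); again, the simulated data and variable names are assumptions made for illustration.

```r
set.seed(2)
n <- 500
x <- rnorm(n)
y <- rpois(n, lambda = exp(0.5 + 0.8 * x))   # simulated count outcome

# Negative Poisson log-likelihood with a log link
poisson_negll <- function(par) {
  eta <- par[1] + par[2] * x                 # linear predictor
  -sum(dpois(y, lambda = exp(eta), log = TRUE))
}

fit <- optim(c(0, 0), poisson_negll, method = "BFGS")

fit$par                              # ML estimates of the two coefficients
coef(glm(y ~ x, family = poisson))   # built-in fit; should agree closely
```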
Statistics, including regression models with different types of dependent variables.
Basic knowledge of R.
Day | Topic | Details
---|---|---
Friday | Inference and MLE | The concept of likelihood and probability, probability theory, the likelihood model of inference, Newton-type algorithms in MLE, properties of MLE, linear regression models
Saturday Morning | Inference and MLE | The concept of likelihood and probability, probability theory, the likelihood model of inference, Newton-type algorithms in MLE, properties of MLE, linear regression models
Saturday Afternoon | Applied MLE | Discrete regression models, R exercise
Day | Readings
---|---
Friday | Gary King, Unifying Political Methodology: The Likelihood Theory of Statistical Inference, 1998, Chapters 1–4; Henning Best and Christof Wolf (eds), The SAGE Handbook of Regression Analysis and Causal Inference, Sage, 2014, Chapter 2: Martin Elff, "Estimation Techniques: Ordinary Least Squares and Maximum Likelihood"
Saturday Morning | Henning Best and Christof Wolf (eds), The SAGE Handbook of Regression Analysis and Causal Inference, Sage, 2014, Chapter 2: Martin Elff, "Estimation Techniques: Ordinary Least Squares and Maximum Likelihood"
Saturday Afternoon | Gary King, Unifying Political Methodology: The Likelihood Theory of Statistical Inference, 1998, Chapter 5
Download R for free if you do not already have it installed.
Please bring your own laptop
Henning Best and Christof Wolf (eds), The SAGE Handbook of Regression Analysis and Causal Inference, Sage, 2014; Chapter 2: Martin Elff, "Estimation Techniques: Ordinary Least Squares and Maximum Likelihood"
Gary King, Unifying Political Methodology: The Likelihood Theory of Statistical Inference, University of Michigan Press, 1998
Introduction to Statistics
Advanced Topics in Applied Regression
Multiple Regression
Introduction to Bayesian Inference
Multilevel Modelling
Panel Data Analysis
Advanced Discrete Choice Modelling