ECPR

Limits of the GDPR in Regulating Profiling and Algorithmic Decision-Making

Governance
Post-Structuralism
Power
Big Data
Michaela Padden
Karlstad University
Andreas Öjehag
Karlstad University
Abstract

As policy makers seek more effective and efficient forms of service delivery, distributed networks of sensors, cameras and meters that generate big data hold great appeal for the improved management of utilities such as transport and energy. Beyond these technical solutions, the potential of big data for functions such as profiling and social scoring has driven an increase in automated decision-making by governments in areas such as education, social welfare, healthcare and justice. Private sector industries such as banking and insurance are also relying more on ‘non-traditional’ sources of data when making automated decisions. Big data analytics has spawned a data-broking industry in which firms assimilate information concerning our online activity, purchases and social media habits and then sell it on as analytic scores or classifications, which other companies in turn use to target or nudge us towards specific products and services, often without our direct knowledge. In this paper we present an analysis of the GDPR and key policy documents related to its introduction. We do this by adopting a “What’s the problem represented to be?” (WPR) approach, seeking to understand the political rationalities underpinning this policy and the assumptions embedded within it, which in turn presuppose specific constructions of subjects and objects within the policy dynamic. The paper also aims to identify ‘silences’ or ‘gaps’ in the GDPR with respect to potentially discriminatory or unfair practices such as profiling, nudging, predictive governance and algorithmic decision-making.