Human-centric AI: Hybridity and non-human agency in algorithmic governance

Governance
Public Policy
Technology
Tero Erkkilä
University of Helsinki
Konstantinos Kostas
University of Helsinki

Abstract

‘Human-centric AI’ has become a core concept of responsible algorithmic governance, in which human-centricity is invoked to empower people amid the increasing automation of society. Digital transformation is associated with the complexification of governance, raising questions such as how to regulate automation technologies given the ‘black-box’ nature of AI. Moreover, AI governance entails a shift in forms of control, because automation blurs human agency in public decision-making and thereby raises the question of accountability. The notion of ‘human-centric AI’ is prevalent in most AI governance strategies, from Finland and Singapore to the European Union, where human-centricity is proposed as a remedy for perceived problems of accountability. In these strategies, human-centricity is portrayed as a design principle with which to break the path dependency of ‘organization-centricity’. We analyze human-centricity as an instance of hybridity, implying both public and private actors and human and non-human agency. Our analysis shows paradoxical features in the current employment of human-centricity in algorithmic governance. For example, instead of enhancing accountability, human-centricity is used as an argument for designing seamless, rapid service allocation for the individual citizen in public governance, and for removing organizational boundaries and ‘breaking organizational silos’ (both public and private), thereby blurring accountability relations.