A Neural Network for Scaling Party Positions from Texts Combining Transformer and Supervised Dimension Reduction

Political Methodology
Political Parties
Political Ideology
Big Data
Hung H. V. Nguyen
Universität Bremen

Abstract

Previous research in both Natural Language Processing (NLP) and political science has demonstrated the superiority of the Transformer architecture and Transformer-based large language models (LLMs) over word-frequency and word-embedding approaches on multiple text analysis tasks. This article introduces "ContextScale", a novel Transformer-based neural network tailored for party position scaling. ContextScale can simultaneously predict complex political topics and scale text sequences on multidimensional left-right scales. Besides the sizable boost in text representation that the Transformer provides, ContextScale also benefits from a supervised dimension reduction step, which reduces the complex contextual embeddings produced by the Transformer into meaningful position scores for political science research. Validation exercises demonstrate the advantages this approach has over existing methods. Moreover, ContextScale is easily extensible across languages, political domains, and theoretical models. A dataset of party positions for seventeen Western democracies is released along with the paper.
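To make the described architecture concrete, below is a minimal sketch (in Python, using PyTorch and the Hugging Face transformers library) of how a Transformer backbone with a multi-task head could be wired up: one head classifies the political topic of a text sequence, while a second, low-dimensional linear head acts as the supervised dimension reduction step that maps contextual embeddings onto position scales. The backbone name, head sizes, and number of scaling dimensions are illustrative assumptions, not the paper's published implementation.

    # Sketch only: backbone, head sizes, and dimension count are assumptions.
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class ContextScaleSketch(nn.Module):
        def __init__(self, backbone="bert-base-multilingual-cased",
                     n_topics=8, n_dims=2):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(backbone)
            hidden = self.encoder.config.hidden_size
            # Topic head: which policy domain a sequence belongs to.
            self.topic_head = nn.Linear(hidden, n_topics)
            # Supervised dimension reduction: project the high-dimensional
            # contextual embedding onto a few left-right position scales.
            self.scale_head = nn.Linear(hidden, n_dims)

        def forward(self, input_ids, attention_mask):
            out = self.encoder(input_ids=input_ids,
                               attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]  # [CLS] embedding per sequence
            return self.topic_head(cls), self.scale_head(cls)

    tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = ContextScaleSketch()
    batch = tok(["We will cut taxes for working families."],
                return_tensors="pt", padding=True, truncation=True)
    topic_logits, positions = model(batch["input_ids"],
                                    batch["attention_mask"])

In a setup like this, the scale head plays the role of the supervised dimension reduction step: trained jointly with the topic labels and position targets, it learns a projection from the encoder's hidden space onto a handful of interpretable left-right dimensions, so topic prediction and position scaling happen in a single forward pass.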