Algorithmic Selection and User (De-)Selection of Political Short Videos on TikTok: Evidence from a Four-Week Tracking Study Before the 2024 European Election Campaign
Quantitative
Social Media
Communication
Mixed Methods
Technology
Empirical
Abstract
In political communication, short-video recommender systems (SRS) such as TikTok's For-You Page are often presented as central gateways to reach constituents. Prior research has largely focused on the supply side of political short videos (Zamora-Medina et al., 2023), yet reaching users requires passing two barriers: algorithmic selection (content appearing in feeds) and user selection (the decision to keep watching). Selection opportunities for political information, through both incidental encounters and active selection decisions, can vary based on personal characteristics such as sociodemographics and political predispositions (Barnidge et al., 2021; Knobloch-Westerwick et al., 2020). In highly personalized environments where users primarily scroll past rather than actively choose content, such inequalities may compound (Shin & Jitkajornwanich, 2024). Even when content is encountered incidentally, this does not guarantee user selection or attention, which is a prerequisite for desired effects (Nanz & Matthes, 2022) and methodologically challenging to measure empirically (Cho et al., 2025; Neijens et al., 2024). Consequently, we know little about whether and how intensively users receive political content in SRS (algorithmic selection: RQ1) and subsequently respond to selection opportunities (user selection: RQ2) based on personal characteristics.
We employ a four-week linkage design combining mobile in-app tracking and survey data from German TikTok users (N=411). During the four weeks preceding the European Parliament election (May-June 2024), participants' devices recorded every video entering their feed, yielding over 380,000 videos. Political content was identified via a systematically compiled list of 2,397 political accounts (parties, youth organizations, politicians), resulting in 326 political videos. Algorithmic selection was operationalized as the probability of encountering any political content and its frequency in feeds. User selection was operationalized as viewing duration: the time a video remained on screen before scrolling to the next, with deselection defined as skipping within ≤3 seconds. Personal characteristics were measured via survey, including sociodemographics (age, gender, education) and political predispositions (interest, ideology, party preference).
Algorithmic selection was estimated using zero-inflated negative binomial models, separating the probability of any political encounter from its frequency. Political content was rare and unevenly distributed (M=1.27 videos, SD=4.27), with many users receiving none. Centrist users encountered fewer political videos than left-leaning ones (β=0.87, p=.018), while women (β=1.59, p=.006) and less intensive TikTok users (β=-0.88, p<.001) were more likely to see no political content at all.
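The zero-inflated structure described above can be sketched as a two-part data-generating process: a logistic component producing structural zeros (users whose feeds never surface political content) and a negative binomial component for encounter counts among the rest. The parameter values below are illustrative assumptions chosen to roughly match the reported distribution (M=1.27, many zeros), not the fitted estimates.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_zinb(n, pi_zero, mean, dispersion):
    """Simulate zero-inflated negative binomial counts.

    pi_zero: probability of a structural zero (no political content at all)
    mean, dispersion: NB2 parameters for users who can encounter political videos
    (variance = mean + mean**2 / dispersion)
    """
    structural_zero = rng.random(n) < pi_zero
    # Convert NB2 (mean, dispersion) to numpy's (n_successes, p) parameterisation
    p = dispersion / (dispersion + mean)
    counts = rng.negative_binomial(dispersion, p, size=n)
    counts[structural_zero] = 0
    return counts

# Illustrative values only: ~40% structural zeros, overall mean ≈ 1.27
y = simulate_zinb(411, pi_zero=0.4, mean=2.1, dispersion=0.5)
print("mean:", y.mean(), "share of zeros:", (y == 0).mean())
```

In practice such models can be fitted with, e.g., `statsmodels.discrete.count_model.ZeroInflatedNegativeBinomialP`, which estimates the inflation and count components jointly.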
User (de)selection was examined using zero-inflated Gamma hurdle models distinguishing skipping (≤3 seconds) from viewing duration. Women and politically centrist respondents were less likely to be exposed to any political content; women were also less likely to deselect it once exposed (β=-0.93, p=.026), whereas right-leaning users skipped more frequently (β=1.30, p=.028). Older age and medium education increased viewing duration; political interest shortened it. Party-incongruent content did not reduce viewing duration (β=0.02, p=.948); instead, videos from accounts that were neither congruent nor incongruent with users' party preference reduced viewing duration (β=-0.77, p=.023).
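The hurdle logic above splits each observed viewing duration into two components: a binary skip decision at the 3-second threshold (the paper's deselection criterion) and a continuous duration among non-skipped views (modelled with a Gamma in the paper). The helper below is a hypothetical illustration of that split, not the authors' analysis code.

```python
import numpy as np

SKIP_THRESHOLD = 3.0  # seconds; scrolling past within this window counts as deselection

def hurdle_summary(durations):
    """Split viewing durations into the two hurdle components:
    (1) the skip rate (duration <= 3 s), modelled logistically in the paper, and
    (2) the mean duration among non-skipped views, modelled with a Gamma.
    """
    durations = np.asarray(durations, dtype=float)
    skipped = durations <= SKIP_THRESHOLD
    viewed = durations[~skipped]
    return {
        "skip_rate": skipped.mean(),
        "mean_view_duration": viewed.mean() if viewed.size else float("nan"),
    }

# Hypothetical viewing durations (seconds) for six political videos
summary = hurdle_summary([1.2, 0.8, 15.0, 42.5, 2.9, 7.3])
print(summary)  # skip_rate: 0.5, mean_view_duration: 21.6
```

Separating the two components matters because, as the results show, a covariate can shift them in opposite directions (e.g., raising the skip rate while leaving conditional viewing duration unchanged).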
Political feeds differ systematically based on personal characteristics, and disparities emerge through both algorithmic and user selection. Crucially, these barriers often operate in opposing directions: for political actors seeking to reach constituents through SRS, passing the first barrier of algorithmic selection does not guarantee passing the second.