ECPR


Cognitive Warfare and Digital Authoritarianism by Proxy: Chinese State Media Amplification of Russian Frames of the War in Ukraine on YouTube and TikTok

China
International Relations
Media
Social Media
War
Narratives
Big Data
Influence
Julian Theseira
Charles University



Abstract

Russia’s invasion of Ukraine has unfolded alongside intensified platform governance and deplatforming. As platforms restrict, label, or remove some state-controlled outlets, influence efforts can persist through aligned secondary amplifiers that retain access and credibility. This paper examines cognitive warfare and digital authoritarianism by proxy, asking whether China’s foreign-facing state media, CGTN and Xinhua, amplify and adapt Russian state-media frames about the war in Ukraine on YouTube and TikTok. Bridging research on cognitive warfare, reflexive control, and authoritarian innovation and diffusion, I conceptualize proxy amplification as an adaptation to asymmetric platform constraints: when overtly partisan Russian outlets are restricted or stigmatized, a partner state’s international broadcasters can selectively reproduce, sanitize, or repackage Russian interpretive packages while maintaining a posture of neutrality. The mechanism is not simple repetition; it includes translation across languages, reframing through diplomatic tropes, and platform-native packaging (short video, captions, hashtags) that travels through recommendation systems.

The analysis evaluates two expectations. First, if proxy amplification is occurring, Chinese state-media content should exhibit semantic convergence with Russian frames on key dimensions: provocation and blame narratives; portrayals of NATO, Western military aid, and sanctions; depictions of escalation and civilian harm; delegitimation of Ukrainian agency; and “peace” narratives that relocate blame while foregrounding negotiations on terms favorable to Moscow. Convergence may be paired with neutrality signals (balanced sourcing, calls for restraint, or procedural emphasis) that increase perceived credibility. Second, if convergence reflects coordinated diffusion rather than parallel reactions to shared events, shifts in Chinese frame prevalence should follow Russian frame launches at short lags, net of common event shocks.

Empirically, I assemble two multilingual corpora spanning January to December 2025. Corpus 1 comprises CGTN and Xinhua posts about Ukraine on YouTube and TikTok, including captions, hashtags, and, where feasible, transcript text from subtitles or auto-captions, plus engagement metrics (views, likes, shares) to approximate distribution. Corpus 2 is a cross-lingual reference corpus of RT and Sputnik website articles over the same period. To separate proxy amplification from generic international news salience, I add a benchmark set of non-state international reporting on the same events.

Methodologically, the study combines inductive frame discovery with cross-lingual measurement of similarity and timing. TopicGPT and TURFTopic extract recurrent frames within each corpus, and multilingual contextual embeddings are used to construct outlet-level frame vectors. Convergence is quantified via embedding similarity and frame-prevalence overlap; temporal alignment is assessed via cross-correlation and lead-lag regressions with event controls. The workflow includes stability checks across model specifications, targeted human validation of frame labels, and sensitivity analyses by language and platform format.

The paper contributes to the Digital Authoritarianism agenda by specifying a portable concept and a transparent measurement strategy for transnational influence under asymmetric platform constraints. It shows how authoritarian innovation and diffusion can operate through state-media alliances and platform affordances, and it clarifies why deplatforming can displace, rather than eliminate, cognitive warfare by shifting distribution to higher-credibility proxies. The findings speak to debates on platform governance, regulatory responses to state-linked media, and democratic information resilience.
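The two core measurements described in the abstract, embedding- or prevalence-based convergence between outlets and a lead-lag relationship in frame prevalence, can be illustrated with a minimal self-contained sketch. All frame labels, vectors, and weekly series below are hypothetical toy data; the actual study uses multilingual contextual embeddings and lead-lag regressions with event controls rather than these simplified stand-ins.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two outlet-level frame vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def lagged_correlation(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag], for lag >= 0
    (i.e. y trailing x by `lag` periods)."""
    xs, ys = x[: len(x) - lag], y[lag:]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(xs, ys))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in xs))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in ys))
    return cov / (sd_x * sd_y)

# Hypothetical outlet-level frame vectors: share of posts assigned to
# each of four frames (e.g. provocation/blame, NATO, escalation, "peace").
rt_frames = [0.40, 0.30, 0.20, 0.10]
cgtn_frames = [0.35, 0.30, 0.25, 0.10]
convergence = cosine_similarity(rt_frames, cgtn_frames)

# Hypothetical weekly prevalence of one frame in each corpus; the Chinese
# series here is constructed to trail the Russian series by one week.
russian_prev = [0.10, 0.40, 0.35, 0.20, 0.15, 0.30]
chinese_prev = [0.05, 0.12, 0.38, 0.33, 0.22, 0.18]

# The lag maximizing cross-correlation; a peak at lag 1 is the pattern
# the diffusion expectation predicts (Chinese prevalence follows Russian).
best_lag = max(range(3), key=lambda k: lagged_correlation(russian_prev, chinese_prev, k))
```

In this toy setup the frame vectors are highly similar and the cross-correlation peaks at a one-week lag; the paper's design additionally nets out common event shocks, which a raw cross-correlation like this one cannot do.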