ECPR


Speaking Carefully: Comment Editing as a Behavioral Indicator of Political Self-Censorship

Political Violence
Internet
Social Media
Political Activism
Burak Giray
Technical University of Munich
Ilker Kalin
Matej Bel University in Banská Bystrica



Abstract

Understanding how citizens adjust their online expression in environments with constrained political communication remains a central question in digital politics. This study examines patterns of self-censorship on social media by analyzing how users modify their comments when interacting with politically sensitive content. Focusing on Turkey—an electoral authoritarian context with well-documented digital monitoring practices—we analyze comment-editing behavior on YouTube to assess whether users adapt their public expression when discussing politically salient issues.

Using the YouTube Data API, we constructed a dataset of user comments posted under a curated set of videos. The sample includes videos addressing politically sensitive topics, such as government-related debates and protest events, alongside apolitical control videos matched on upload timing and engagement levels. A distinctive feature of the YouTube platform is its “edited” flag, which denotes whether a comment has been revised after initial posting. Although prior versions of comments are not accessible, this metadata offers a behavioral indicator of reconsideration, allowing us to infer potential self-censorship patterns without accessing private user information.

Our analysis focuses on two outcomes: (1) whether users are more likely to edit their first comment when responding to politically sensitive videos, and (2) whether edited comments differ in sentiment from comments that remain unedited. The results reveal clear contextual patterns. First, users commenting on politically sensitive videos display a significantly higher probability of editing their initial comment than users on control videos. This suggests that individuals may exercise greater caution or self-monitoring when participating in discussions they perceive as more delicate or potentially contentious.
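The edit-flag measure described above can be illustrated with a short sketch. In the YouTube Data API v3, each top-level comment returned by `commentThreads.list` carries both a `publishedAt` and an `updatedAt` timestamp, and a mismatch between the two indicates the comment was revised after posting. The records below are hypothetical examples of that response shape, invented for illustration; they are not data from the study.

```python
# Hypothetical examples of the snippet metadata returned by the YouTube Data
# API v3 (commentThreads.list, part="snippet"); values are invented.
sample_threads = [
    {"snippet": {"topLevelComment": {"snippet": {
        "publishedAt": "2024-03-01T10:00:00Z",
        "updatedAt":   "2024-03-01T10:00:00Z",   # timestamps match: never edited
        "textDisplay": "First comment"}}}},
    {"snippet": {"topLevelComment": {"snippet": {
        "publishedAt": "2024-03-01T11:00:00Z",
        "updatedAt":   "2024-03-01T11:05:00Z",   # revised five minutes later
        "textDisplay": "Second comment"}}}},
]

def is_edited(thread: dict) -> bool:
    """A top-level comment counts as edited when updatedAt differs from publishedAt."""
    s = thread["snippet"]["topLevelComment"]["snippet"]
    return s["updatedAt"] != s["publishedAt"]

# Share of edited comments in a sample; comparing this rate between
# politically sensitive and control videos is the first outcome of interest.
edit_rate = sum(is_edited(t) for t in sample_threads) / len(sample_threads)
```

Because earlier drafts are never exposed by the API, this binary flag is the only behavioral trace of revision available, which is exactly why the study treats it as an indicator of reconsideration rather than of content change.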
Second, sentiment analysis indicates that edited comments tend to exhibit more positive sentiment than comments that were never edited. Although the content of earlier drafts cannot be recovered, the observed shift suggests that users who choose to revise their remarks may be softening or moderating the tone of their contributions before allowing them to be publicly displayed. Together, these findings provide empirical evidence that the political context of online conversations can shape how citizens present themselves on social media. By leveraging subtle behavioral traces contained in platform metadata, this study demonstrates that even minimal observational indicators—such as whether a comment has been edited—can reveal meaningful patterns in political communication. More broadly, the analysis highlights how perceived political sensitivity can influence everyday online interactions, shaping not only whether individuals speak but how they calibrate the tone and visibility of their public expression.
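The second outcome, the sentiment gap between edited and unedited comments, amounts to comparing group means of polarity scores. The sketch below uses invented scores in [-1, 1] (as a generic polarity classifier might produce) purely to show the shape of the comparison; the numbers are not from the study.

```python
# Sketch of the sentiment comparison: hypothetical polarity scores in [-1, 1]
# for comments flagged as edited vs. comments never edited. Values invented.
from statistics import mean

edited_scores   = [0.4, 0.2, 0.5, 0.1, 0.3]     # edited comments
unedited_scores = [-0.2, 0.0, 0.1, -0.3, -0.1]  # unedited comments

# A positive gap means edited comments read more positive on average,
# consistent with users softening their tone when they revise.
gap = mean(edited_scores) - mean(unedited_scores)
```

In practice this difference would be tested with an appropriate statistical model rather than a raw mean comparison, but the quantity of interest is the same.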