From E-Commerce to Digital Services: An Analysis of The New Rules for Platforms in the EU

Public Policy
Regulation
Internet
Social Media
Technology

Abstract

“Doing platform regulation” has long meant multi-stakeholder procedures and agreements that rely on the signatories’ goodwill to address the challenges identified by civil society and experts. In general, lawmakers in the U.S. and in the EU have refrained from regulating platforms hosting user-generated content. Instead, under intermediary liability provisions such as Sec. 230 CDA or Art. 14 of the E-Commerce Directive, platforms were treated as content-neutral infrastructures, mere conduits. While the law was semi-blind to their content moderation activities, scholars have pointed out that platforms were de facto deciding what could or could not be communicated online (amongst others: Kuczerawy 2017; Gillespie 2018; Suzor 2019). Recently, at least in the EU, this perception has changed, and the call for hard laws and/or more effective regulation against harmful online communication, which would in turn limit the platforms’ power over free speech, has grown louder. The new EU Commission has also identified this topic as a key issue and plans to change the platform liability rules for unlawful content. Little is known so far about the planned EU Digital Services Act (DSA), but according to a leaked document (DG Connect Concept Note, April 2019), the horizontal framework (Latzer et al. 2013, p. 374) under the E-Commerce Directive is ‘outdated’ and needs to be replaced by a more comprehensive set of rules for digital services. Regarding content moderation, the leaked note proposes to make uniform rules for the removal of illegal content binding across the EU and possibly to extend them to harmful (not necessarily unlawful) content. On the more technical side, the authors suggest maintaining the ban on general content monitoring but reconsidering special provisions for filter technologies. Of course, it is so far unclear whether the propositions contained in this leaked note will eventually be put into the DSA. However, they could constitute the grounds for future policy and should therefore be monitored closely. Additionally, this development goes hand in hand with the observation that large social media platforms tend to adopt new structures that resemble administrative law, an uncommon development for non-state actors (Heldt 2019). This contribution provides a policy analysis of recent regulatory developments in the EU in order to ground a forward-looking focus on the DSA, which could eventually replace the laws adopted by individual Member States in recent years. It will then bring together governance and legal scholarship to offer a normative analysis of the potential risks to freedom of expression and information, and to assess whether the DSA will provide sufficient tools to enhance transparency and accountability. Given that upcoming forms of EU regulation could herald a new period for digital platforms and have indirect ripple effects on the rights of users worldwide, the paper will discuss the trade-offs between legalization and softer regulatory instruments when seeking effective control mechanisms and sufficient oversight for key digital public policy issues.