To Break Up or Regulate Big Tech? Avenues to Constrain Private Power in the DSA/DMA Package

Eyes Wide Open

The Digital Services Act must confront a Gordian knot of fundamental rights and public interests across a range of affected actors. To be effective, the new regulation must both reflect the current reality of intermediary service provision and remain flexible enough to accommodate future technological developments. It currently falls short of this aim.

Platform research access in Article 31 of the Digital Services Act

Over the past year, dominant platforms such as Facebook have repeatedly interfered with independent research projects, prompting calls for reform. Platforms are shaping up as gatekeepers not only of online content and commerce, but also of research into these phenomena. As self-regulation flounders, researchers are pinning their hopes on Article 31 of the proposed Digital Services Act, on “Data Access and Scrutiny”, a highly ambitious tool to compel access to certain data. Yet researchers also need a shield to protect their independent projects against interference.

Re-Subjecting State-Like Actors to the State

The Digital Services Act aims to limit the power of the Big Tech companies and to place more responsibility on them to control the content posted on their websites. Rather than handing the platforms even more power through de facto self-regulation, the DSA should strengthen the ability of public authorities to intervene.

How to Challenge Big Tech

The European Commission's proposal for a Digital Markets Act is meant to complement EU competition law in order to guarantee contestable digital markets. From a policy point of view, however, the current self-imposed restriction to behavioural remedies in competition law and merger control, as well as the focus on behavioural ex ante regulation via the DMA, is at best a half-hearted and at worst a misguided way to address the Big Tech challenge effectively. We argue in favour of a competition law toolkit with extended options for structural measures to tackle entrenched market dysfunctionalities.

Human Ads Beyond Targeted Advertising

If bridling harmful targeted advertising is a core objective of the DSA, the exclusion of influencer marketing is a grave oversight. Amendments introduced by the Internal Market and Consumer Protection Committee in the European Parliament may remedy this omission. If “human ads” remain excluded, Big Tech platforms’ sophisticated data-driven business models will continue to escape comprehensive regulation and, hence, their power will remain unchecked.

Enforcement of the DSA and the DMA

In trying to overcome the pitfalls of cross-border enforcement under the GDPR, the Commission’s proposals for a Digital Services Act and a Digital Markets Act significantly expand the Commission’s enforcement powers. Unfortunately, what is touted as a solution to cross-border enforcement issues might create new difficulties and challenges, given the risks of centralising power in the Commission.

Private Enforcement for the DSA/DGA/DMA Package

The package consisting of the Digital Markets Act, the Digital Services Act, and the Data Governance Act is about empowering authorities vis-à-vis powerful private market players. Private enforcement is absent from this package, despite its great potential: by engaging in rule enforcement, individuals and companies help to confine key market players’ (unlawful use of) economic power, while also counterbalancing the tendency for state agencies to become the sole decision-makers on when and how to sanction what they consider undue conduct.

Why End-User Consent Cannot Keep Markets Contestable

A central source of Big Tech gatekeepers’ power is their encompassing access to individuals’ personal data. The prohibition in Article 5(a) of the proposed Digital Markets Act is therefore a welcome attempt to limit the private power over data held by gatekeeping platforms. However, end-user consent cannot be regarded as an adequate safeguard for keeping data-driven markets competitive.

General and specific monitoring obligations in the Digital Services Act

The Digital Services Act contains regulation that does not directly interfere with platforms’ freedom to operate but indirectly creates incentives for risk-aware behaviour, for example with regard to violations of personality rights. In the context of the Act's general and specific monitoring obligations in particular, such indirect regulation can encourage innovative and pragmatic decision-making, although further guardrails are necessary.

Using Terms and Conditions to apply Fundamental Rights to Content Moderation

Under EU law, platforms currently have no obligation to incorporate fundamental rights into their terms and conditions. The Digital Services Act seeks to change this in its draft Article 12; however, the provision has drawn severe criticism for its meagre protection. As it stands, and until courts intervene, it is too vague and ambiguous to effectively support the application of fundamental rights.