Articles for tag: automated decision-making, DSGVO, KI, Schufa

To Score Is to Decide

Can the act of assigning a score to someone constitute a decision? This, in essence, is the question the Court of Justice of the European Union (CJEU) had to answer in Case C-634/21. And the Court’s answer is yes, following in the footsteps of the Advocate General’s opinion on the case. Rendered on 7 December 2023, this ruling was eagerly awaited, as it was the first time the Court had the opportunity to interpret the notorious Article 22 of the General Data Protection Regulation (GDPR), which prohibits decisions “based solely on automated processing”.

Europe’s Digital Constitution

In the United States, European reforms of the digital economy are often met with criticism. Repeatedly, eminent American voices have called for an end to Europe’s “techno-nationalism.” Yet this common charge of digital protectionism, while plausible, is overly simplistic. This blog post argues instead that European digital regulations reflect a host of values consistent with the broader European economic and political project. The EU’s digital agenda reflects its manifest commitment to fundamental rights, democracy, fairness, and redistribution, as well as its respect for the rule of law. These normative commitments, and the laws implementing them, can be viewed in aggregate as Europe’s digital constitution.

Data Protection as a Legally Protected Interest?

While material damages claims for data protection violations appear to play a subordinate role in practice and are relatively easy to establish and quantify, the compensability of non-material damage provided for in Article 82 GDPR gives the courts headaches. In early May 2023, the CJEU delivered a landmark decision on non-material damages for GDPR infringements in Case C-300/21. It is the first judgment in a long series of requests for preliminary rulings on the interpretation of Article 82 GDPR. What remains open to interpretation, however, is how non-material damage is to be concretely established and quantified. After a brief summary of the CJEU’s key findings, this post therefore addresses this practically relevant problem and aims to outline possible approaches for the Member States’ courts, drawing in particular on established instruments of German and Austrian legal practice.

How the Platform Work Directive Protects Workers’ Data

The Commission's proposal for the new platform labour directive came with a core promise to platform workers in the EU: to recognize the impact algorithmic management has on their working conditions. In doing so, the directive seeks to clarify and strengthen the data rights of workers, regardless of whether they are classified as employees. This blog post argues that the main achievement of the proposed Directive is to clarify and reframe existing norms on automated decision-making in a way that shifts attention from data to working conditions. While the specific proposed provisions do not go far beyond norms already established in the General Data Protection Regulation, they are reframed in a way that makes clear that digital labour platforms are responsible for ensuring fairness, transparency and accountability when making decisions that rely on algorithms.

Competition law as a powerful tool for effective enforcement of the GDPR

It looks like a good week for data protection. On Tuesday, the Commission presented a new proposal for a Regulation on additional procedural rules for the GDPR, and a few hours later, the ECJ published its decision C-252/21 in Meta Platforms v Bundeskartellamt (Federal Cartel Office). While the Commission's proposal to improve enforcement in cross-border cases should probably be taken with a pinch of salt, the ECJ ruled on some points with remarkable clarity. The first reactions to the ruling were ones of surprise: few had expected the ECJ to take such a clear stance against Meta's targeted advertising business model. The ruling does, however, represent a consistent interpretation of the GDPR in the tradition and understanding of data protection as a limit on power.

The GDPR’s Journalistic Exemption and its Side Effects

On 25 May 2023, we mark the fifth anniversary of the General Data Protection Regulation’s (GDPR) full application in the European Union (EU). While the Regulation is primarily known for its impact on business, it has also brought about significant changes to data processing by media outlets, which are often overlooked in discussions about data protection. This blog post analyzes what is commonly called the “journalistic exemption” under Article 85 of the GDPR, which requires Member States to regulate the extent to which the GDPR applies to journalists and others writing in the public interest. It then examines how that exemption is implemented across the Member States and considers the problematic consequences of the GDPR’s uneven application to the media sector, including the instrumentalization of the GDPR in strategic litigation against public participation (SLAPPs) targeting journalists.

Squaring the triangle of fundamental rights concerns

At first glance, the July 2022 ruling by the Court of Justice of the EU on Passenger Name Records had a very narrow scope: the use of passenger name records by government agencies. Upon closer inspection, however, it has important implications for the governance of algorithms more generally. That is especially true for the proposed AI Act, which is currently working its way through the EU institutions. Ultimately, the ruling highlights how national, or in this case European, legal orders may limit the scope for international regulatory harmonization and cooperation.

Automated predictive threat detection after Ligue des Droits Humains

The Ligue des droits humains ruling on automated predictive threat detection has implications for the European Travel Information and Authorisation System (ETIAS) Regulation and the EU Commission’s proposal for a Regulation on combating online child sexual abuse material (CSAM). Both legal instruments entail the use of potentially self-learning algorithms and are spiritual successors to the PNR Directive, the subject of Ligue des droits humains.

EU Privacy and Public-Private Collaboration

Core state functions, such as law enforcement, are increasingly delegated to private actors. Nowhere is this more apparent than in the development and use of security technologies. This public-private collaboration harbours detrimental consequences for fundamental rights and the rule of law; in particular, for the principle of legality. The policy outcomes which result from this collaboration are not democratically accountable, and allow human rights to be superseded by private, profit-driven interests.

Challenging Bias and Discrimination in Automated Border Decisions

In Ligue des droits humains, the Court of Justice of the European Union explicitly addresses the fact that the use of AI and self-learning risk models may deprive data subjects of their right to effective judicial protection as enshrined in the Charter. The importance of this judgment for non-EU citizens, and for what happens at Europe’s borders more generally, cannot be overstated.