
German Court Decision Signals Move Towards Risk-Based Approach to Data Transfers


This article discusses a July decision of the Regional Court of Traunstein (Germany) that shows a more flexible approach to data transfers from the EU to the U.S. than has been taken by European Data Protection Authorities (DPAs). As we also discuss in a shorter new article published by the IAPP, this decision contains several important findings and reflects a risk-based approach to Chapter V of the General Data Protection Regulation (GDPR), which governs transfers of personal data to third countries. Importantly, the approach adopted by the Traunstein court contrasts with the “zero-risk” approach to data transfers adopted by DPAs since the European Court of Justice’s Schrems II decision in 2020.

In this article we discuss the EU DPAs’ “zero-risk” approach to data transfers, before analysing the findings of the Regional Court of Traunstein. We also discuss other decisions of national courts, concluding that the Traunstein decision appears to be the latest judicial decision by a court in an EU Member State that adopts a more flexible approach towards the interpretation of Chapter V of the GDPR.

  1. The EU DPAs’ “Zero-Risk” approach to data transfers

Since the European Court of Justice’s Schrems II decision, European privacy regulators have adopted what has been described as a “zero risk” approach to the enforcement of data transfer rules. DPAs have been asking data controllers and processors transferring personal data outside the EU to “eliminate” all risks of access to such data by the intelligence and law enforcement agencies of foreign countries whose legal systems do not include data protection legal safeguards in this field that are essentially equivalent to those mandated by EU law. As a result, they have interpreted Chapter V of the GDPR as limiting the transfer of data in a readable format to countries that do not meet the European Data Protection Board’s (EDPB) “European Essential Guarantees” (EEG) requirements, in cases where there is even a theoretical risk of access by intelligence or law enforcement authorities. Simultaneously, and this is the second dimension of the “zero-risk” theory, DPAs have urged EU-based companies to refrain from utilising service providers that localise data in the European Economic Area (EEA) but may be subject to foreign laws.

This approach led to a ramp-up of enforcement actions by EU privacy regulators against US companies. Following a binding decision by the EDPB, the Irish Data Protection Commission (DPC) imposed a record fine of 1.2 billion euros on Meta on May 22, 2023, in relation to transfers of data to the US. More recently, on August 26, 2024, the Dutch Data Protection Authority (DPA) imposed a fine of 290 million euros on Uber for its own transfers of European drivers’ data to the US during the period before the new July 2023 adequacy decision.

The enforcement of this “zero risk” approach was also directed against European entities that relied on US companies for digital services. After Schrems II, Noyb filed 101 complaints against the use of Google Analytics and Facebook Connect integrations in the webpages of EU controllers. European companies targeted by these complaints argued in their defense that the challenged transfers concerned low-risk audience measurement data and that this analytics data was of little interest to U.S. or other intelligence services. Google, for instance, stated that it had received “0 requests for such data in the 15 years in which Google Analytics has been offered”. Nonetheless, European DPAs rejected these defenses and focused on the purely theoretical risk of access to analytics data by US authorities. DPAs reached more than 12 decisions (discussed here, pp. 21-27) condemning such companies for the use of Google Analytics or other services involving data transfers to the US, including a fine of one million euros imposed on Tele2 by the Swedish DPA on June 30, 2023, just ten days before the EU Commission found transfers of personal data to the US adequate under the EU-U.S. Data Privacy Framework. Even after the adoption of the adequacy decision, DPAs have continued to issue decisions condemning European companies for using Google Analytics in the period before its adoption, as shown by the Telenor decision issued by the Norwegian Data Protection Authority (Datatilsynet) on July 26, 2023.

  2. The Traunstein court decision

A German plaintiff (whose name was redacted according to standard German procedure) sued a social media network, alleging violations of the General Data Protection Regulation (GDPR).  The social network’s name is also redacted but various parts of the decision suggest it was Meta; for convenience, this article will simply use “Meta” to refer to the defendant.

The plaintiff raised a number of GDPR causes of action, one of which was that Meta’s transfers of its users’ data to servers in the US should be found unlawful, both before and after the new adequacy decision based on the EU-US Data Privacy Framework. He argued that the United States did not offer an EU-equivalent level of protection for his data because once in the US, data can be disclosed to the NSA on a “warrantless” (anlasslos) basis. He also argued that since he did not consent to the transfer of his data to the US, it should have remained in Europe. The plaintiff requested monetary damages, and also asked the court to enjoin Meta from further transferring his data to the US.

The court rejected all the plaintiff’s claims and dismissed his lawsuit. In doing so, the court took a multifaceted approach to evaluating whether the transfers at issue were lawful. It considered a number of factors outside the bare question of whether US agencies could theoretically access the plaintiff’s data. We summarize below the court’s more salient points in finding that Meta’s transfers of user data to the US were lawful, while making a few comments about the importance of these findings.

     a) The inevitability and lawfulness of transfers of content published by a user in a worldwide social network

As stated above, the plaintiff argued that Meta should not transfer to the US the content he was publishing in his account, because there was a risk of access by US intelligence services. The Court rejected this argument, noting that Meta is a social media network designed as a “global” platform. The Court emphasized that:

“A global social network based in the USA cannot be accused of unlawfully transferring data to the USA. If the social network is designed as a global platform, data must inevitably be exchanged internationally in order to maintain the global network”.

The Court added that all users of such worldwide networks should obviously be aware that the data they post on such a network will be transferred to other countries where the network operates, including the US. “Searching for users in other jurisdictions can only work if there is a cross-border exchange of data”, noted the Court. Since “[a]ll of this is readily known by every [Meta] user, including the plaintiff,” the court considered that Meta’s transfers were known to the user and lawful.

This is an important finding and, based on our research, appears to be the first time that a court in the EU has focused in such a way on the fundamental issue of how global social networks operate.

As a matter of fact, Meta, like X, LinkedIn, TikTok, YouTube and all other global platforms, is a worldwide network hosting user-generated content (text, photos, short videos, etc.), where data subjects have the autonomy to make their content publicly accessible to all or accessible to their “friends” worldwide. For this content to be available internationally, it is imperative to transfer the data across borders.

Users who opt to make their content publicly available —through public accounts without restricting post visibility— allow their profile information and user content (including text, photographs, videos, audio recordings, livestreams, comments, and hashtags) to be accessed globally, irrespective of the viewer’s account status and even if the viewer does not have a social media account. This necessitates the international transfer of such information to ensure global accessibility and interaction.

Conversely, users with private accounts can restrict content access to their friends/followers only. Nonetheless, to accommodate followers located in or traveling to different countries, these contents must also be internationally transferred to remain accessible.

Similarly, user interactions such as reactions (“likes”) or comments must be globally transferred to ensure they are visible to the intended international audience.

In the “Zero Risk Fallacy” report, one of the authors of the present article argued that DPAs should depart from their formalistic and absolutist approach to data transfers and “recognize that Chapter V does not Mandate Degrading Essential Digital Services in the EU”. The report had discussed extensively the complex nature and functionality of global social media and had concluded the following:

“DPAs should acknowledge that a proportionate approach to Chapter V does not preclude data transfers initiated and sought by individuals themselves and which are indispensable in order to permit to exercise other rights proclaimed by the EU Charter of Fundamental Rights, such as freedom of expression and information. Specifically, when users seek to share posts on social networks and interact with a global audience, how can this be achieved without transferring data beyond EU borders? How would social media platforms provide their services in the EU if there is no logical, conceptual or technical way for them to function without transferring such personal data? Take the case of Meta, which faced a staggering 1,2 billion Euros fine for data transfers to the US. Neither the EDPB nor the DPC decisions clarified how social media services like Facebook could continue to operate seamlessly in the EU and globally without these transfers. The fundamental question to ask is: How can EU users engage with a worldwide community on social media without internationally exchanging data? […] Striking a balance between safeguarding data against the risk of foreign government access and preserving the functionality of indispensable online services is paramount to maintaining a connected and functional digital landscape” (here pp.89-90).

The Traunstein decision acknowledges for the first time this important issue and opts for a pragmatic approach based on the reality of how global social networks function and the legality of data transfers initiated in such networks by individuals themselves. At the same time the Court adopted a risk-based approach concerning government access to data as we will now see.

     b) The mere existence of a generalized, undefined risk arising from US foreign intelligence programs is not sufficient to make transfers unlawful

The court next addressed the plaintiff’s contention that, “as the plaintiff ultimately alleges, [Meta] makes its entire database freely available to the US foreign intelligence service without any preconditions.”  In making this argument, the plaintiff seems to have suggested that the US government has “admitted” Meta makes its “entire database” available to the NSA for foreign-intelligence programs (such as the Section 702 FISA programs) with “no conditions attached” (voraussetzungslos), placing any transfers to the US subject to a risk of access by the NSA – and thus unlawful.

The court held the plaintiff had failed to show any evidence that Meta made its entire database available to US agencies or that the US government had “admitted” anything to this effect. At the same time, the court did not address the existence of US foreign intelligence programs that could theoretically request access to the plaintiff’s data, or the “substantial anxiety and stress” the plaintiff claimed that the prospect of such programs accessing his data had caused him to suffer. Thus, the court at least implicitly acknowledged that the mere existence of US foreign intelligence programs – and the generalized, undefined risk to data they may create – was not sufficient to consider transfers of content published by a user in a worldwide social network unlawful.

     c) Consumers do not have a default expectation that data should be stored in the EU

The court next rejected the plaintiff’s argument that Meta needed to obtain his consent before transferring his data out of the EU. It similarly rejected the argument of the plaintiff that Meta had an obligation to store the European users’ data in Europe. The Court stated:

“The plaintiff has no right to demand that [Meta’s platform] is operated in such a way that all data is stored and processed in Europe in the sense of a purely European [Meta platform].” 

Instead, the location of data storage is a “business” or “operational” decision (unternehmerische Entscheidung) the court suggested was within the discretion of the company holding the data.  Here, since Meta elected to store data in the US, its business decision was “to be accepted by the users,” particularly since users are not compelled to use the platform.

     d) The Data Privacy Framework’s redress mechanism, even though created by executive order, is based on a “law” and thus adequate

As we have noted earlier, the plaintiff did not only challenge the transfers of his data to the US during the period before the adoption of the new EU/US adequacy decision in July 2023, but also after this date.

The plaintiff noted that Meta was relying on the EU-US Data Privacy Framework (“DPF”) as the mechanism on the basis of which it transfers EU data to the US. Part of the DPF is a new redress mechanism, the Data Protection Review Court (DPRC), that enables EU residents to exercise their right to an effective remedy. The DPRC was created by Executive Order 14086 and US Department of Justice regulations. The plaintiff argued that since the DPRC is created by executive order, not by statute, it does not confer a level of remedies equivalent to what is available in the EU.

The Court rejected this argument noting that, to the extent that “data is currently transferred on the basis of the Commission’s adequacy decision of 10 July 2023 […] a further review of the adequacy of the level of protection is therefore unnecessary”.

The Court noted, nonetheless, that the argument raised by the plaintiff cannot be accepted anyway. It emphasized that:

“An [executive] regulation is also a law in the substantive sense. It is not clear why this cannot provide equivalent legal protection.”

Once again this is an important statement by the Court. The argument of the plaintiff in this case is similar to one of the arguments used by French MEP Philippe Latombe in the ongoing proceedings before the EU General Court seeking to annul the DPF. The same argument has also been raised by other critics of the DPF.

Two of the present authors have argued in previous articles (see for instance here and here) that European human rights law as well as EU data protection law “prioritise substance over form” and “understand the term ‘law’ in its substantive sense”, and hence EO 14086 should be considered as binding law under European legal standards. The EDPB had already accepted this position in its Opinion 5/2023 on the European Commission Draft Implementing Decision on the adequate protection of personal data under the EU-US Data Privacy Framework, where it noted that both the European Court of Human Rights and the Court of Justice of the European Union “do not base their assessment on purely formalistic criteria, but regard the substantive safeguards as decisive” (p. 46). Now, for the first time, a court in an EU Member State has also clearly stated that EO 14086 should be considered as “a law in the substantive sense”.

  3. From the Zero-Risk Approach of DPAs to a Risk-Based Approach of National Courts?

The Traunstein decision, although issued by a regional court, is an important one in adopting a risk-based approach to data transfers. However, it is not the first judicial decision in the EU to lean towards a risk-based approach. Several other decisions of courts of EU Member States appear consistent with a more flexible interpretation of Chapter V of the GDPR, based on a series of considerations including the nature of the data, the protections in place, the severity of the risk, and the likelihood of unauthorized access to European data by the authorities of “non-adequate” countries.

As a matter of fact, these previous decisions do not concern transfers that have already occurred to the U.S. (as in the Traunstein decision), but rather the second dimension of the application of the “zero-risk” theory (see section 1 above), namely the risk of access by U.S. authorities to data stored in Europe by companies under U.S. jurisdiction. Several courts in EU countries have not followed the plaintiffs’ argument in these cases that hosting of European personal data by providers subject to extraterritorial laws should be ipso facto prohibited as contrary to Chapter V of the GDPR due to the theoretical risk of access by U.S. authorities.

In France, for instance, in the Doctolib case, the association InterHop and other applicants argued that the hosting of Doctolib’s data (concerning medical appointments for COVID vaccinations) by AWS entailed a risk that Chapter V of the GDPR would be breached, despite the fact that the data was stored in Europe. The Supreme Administrative Court, the Conseil d’État, rejected the application. In an order dated 12 March 2021, the interim relief judge emphasised that:

“Doctolib and AWS have concluded a complementary addendum on the processing of data establishing a precise procedure in the event of requests for access by a public authority to data processed on behalf of Doctolib, providing in particular for the contestation of any general request or one that does not comply with European regulations. Doctolib has also put in place a security system for the data hosted by AWS using an encryption procedure based on a trusted third party located in France to prevent the data from being read by third parties. In view of these safeguards and the data concerned, the level of protection afforded to data relating to appointments booked as part of the Covid-19 vaccination campaign cannot be regarded as manifestly inadequate in the light of the risk of infringement of the General Data Protection Regulation invoked by the applicants”.

Similarly, the Conseil d’État repeatedly rejected the efforts of different claimants to challenge the hosting of the French Health Data Hub’s (HDH) data by Microsoft, despite the French DPA’s intervention in favor of the claimants, in which the CNIL argued that “requests from US authorities, issued under section 702 FISA or EO 12333, and addressed to Microsoft for processing operations subject to the GDPR, should be considered as disclosures not authorised by EU law, pursuant to Article 48 of the GDPR” (see here, pp. 28-29). While the Conseil d’État did not rule out the risk of requests by US authorities, it did not find this risk to be a violation of the GDPR and did not order the HDH to stop using Microsoft. The Court emphasised instead that all the data remained localised in Europe, that several legal and technical protections (including triple pseudonymization of the health data) were in place, and that there was “a public interest” in continuing the activities of the HDH, especially taking into consideration that no alternative solution was in place.

In Belgium, the Council of State ruled on 16 July 2021 that the Flemish authorities’ decision to enter into a contract with a European branch of an American company using AWS cloud services did not breach the GDPR. The Council of State noted, in fact, that the data encryption solutions put in place by AWS, and the fact that the encryption keys were kept internally by the Flemish authorities, showed that the choice of AWS as a subcontractor was not contrary to Article 28 of the GDPR, as the claimant had not been able to demonstrate that the controller and the subcontractor had failed to implement the necessary technical and organisational measures.

In Germany as well, several court decisions have gone in the same direction by not endorsing the “zero-risk” approach to Chapter V of the GDPR requested by the plaintiffs in similar situations concerning data localized in Europe by providers subject to U.S. jurisdiction.

On 7 September 2022, the Higher Regional Court of Karlsruhe considered that a “transfer” does not take place as long as the data remains in the EU, and thus held there is no reason to exclude a US company from public contracts in Germany. In doing so, that Court overturned a decision by the Baden-Württemberg Chamber of Public Procurement which had ruled against the use of US Cloud companies arguing that the mere possibility a foreign government may request access to personal data should be deemed an illegal “transfer” within the meaning of the GDPR, regardless of whether or not such access has actually taken place.

A similar position was adopted by the Federal Chamber of Public Procurement (a tribunal housed within Germany’s Federal Cartel Office similar to an Article I court in the US), which emphasised in a ruling handed down on 13 February 2023 that it was not appropriate to exclude the European subsidiaries of US cloud computing companies from invitations to tender for cloud services. The EU public procurement market is generally open to all companies, regardless of their nationality, the Federal Chamber of Public Procurement pointed out. Excluding companies on the basis of their nationality “would require the creation of a separate legal basis”, the Chamber noted.

Separately, in another decision, the State Labor Appeals Court in the German state of Baden-Württemberg rejected the complaint of a German employee who brought a claim for damages under the GDPR, based on his employer’s use of Workday as its HR management system. The claimant argued that he had suffered “immaterial” harm because “data had been transferred to a country that does not guarantee effective protection of data against local public authorities”. Even though the Court assumed that the plaintiff had a legitimate claim of immaterial harm due to uncertainty about unauthorized government access to his data, the court held that, since Standard Contractual Clauses were executed with Workday, the plaintiff could not trace this harm to “a violation of the GDPR.”

Similarly, the Hessian Administrative Court of Appeals rejected an injunction against the Rhein-Main University of Applied Sciences, which used the Cookiebot consent management platform. The plaintiff argued that the university should not be able to use these services because Cookiebot was using Akamai as a service provider. Akamai, which was apparently storing consent records within Europe, was purportedly subject to production requests under the US CLOUD Act. The Court of Appeals overruled the injunction issued by the Administrative Court of Wiesbaden (discussed here), finding that an injunction had been inappropriate for a number of reasons, including the fact that transfers involved a “complex body of law” that could not be adequately addressed in summary injunctive proceedings. The court thus vacated the injunction and retroactively imposed the court costs from the first-instance proceedings on the plaintiff who had brought the suit. (RMU’s privacy policy still states that it deploys Cookiebot and Akamai on its site.)

  4. Conclusion

The July 2024 decision by the Regional Court of Traunstein is important on more than one count.

  • It emphasizes, for the first time, the inevitability and legality of international transfers of data by a global social network, which cannot function without such transfers. Users of such a worldwide network, stresses the Court, should obviously be aware that the data they post on such a network will be transferred to other countries where the network operates, and cannot challenge these transfers. After all, they are not obliged to post anything on such a platform if they do not wish their data to be transferred internationally.
  • It adheres to a risk-based approach by not considering the mere existence of a generalized, undefined risk arising from US foreign intelligence programs when addressing whether transfers of content published by a user in a global social media network are lawful.
  • It stresses that social media users do not have the right to demand that the platform is operated in such a way that all data is stored and processed in Europe.
  • Finally, it rejects the argument that the new adequacy decision could be challenged on the grounds that the Data Privacy Framework has been enacted by a Presidential Executive Order instead of a statute, emphasizing that such an order can be considered as binding law under European law standards, which focus on substance, not on form.

The Traunstein decision is unique in that it is the first to reject so openly the “zero-risk” approach for data transferred to the U.S. before the adoption of the adequacy decision. However, other decisions of courts of EU Member States, this time concerning the risk of access to data localized in Europe by providers under U.S. jurisdiction, have also cast doubt on the “zero-risk” approach. Taken together, these courts have favored a more flexible interpretation of Chapter V of the GDPR based on a series of considerations including the nature of the data, the protections in place, the severity of the risk, and the likelihood of unauthorized access to European data by the authorities of “non-adequate” countries.

The Traunstein decision shows awareness of the fact that DPAs in Europe have expressed different opinions on many of these aspects. The Court noted, nonetheless, that: “Insofar as data protection authorities hold differing opinions, these are not binding on the court”.

This case is a reminder that the ultimate decisions on issues related to Chapter V of the GDPR are made not by DPAs but by judicial bodies. To date, several Member State court decisions have not followed the “zero-risk” approach requested by the plaintiffs. Ultimately, the European Court of Justice may well become the final arbiter between the “zero-risk” and the risk-based approaches to Chapter V of the GDPR.


These statements are attributable only to the authors, and their publication here does not necessarily reflect the view of the Cross-Border Data Forum or any participating individuals or organizations.
