Mass surveillance of external travellers may go on, says EU’s highest court
On 21 June 2022, the Court of Justice of the European Union (CJEU) delivered its judgment in case C-817/19 Ligue des droits humains from Belgium which challenged the validity of the Passenger Name Records (PNR) Directive. Regrettably, the PNR Directive as such was found to be compatible with the Charter of Fundamental Rights. When Member States implement the Directive, a number of key provisions must be interpreted in a way that is clearly contrary to the wording of the Directive and, arguably, the intent of the EU legislators. In effect, the CJEU amends the PNR Directive to make it compatible with fundamental rights.
This approach is a notable departure from the previous rulings on the Data Retention Directive in April 2014 and the EU-Canada PNR agreement in July 2017, both of which were invalidated by the Court because key provisions were incompatible with the Charter. Even more concerning is the reasoning of the Court, which in the new judgment seems to take a more permissive view of the general and indiscriminate collection and retention of personal data that is highly intrusive to the private lives of the persons concerned. For PNR data, this affects almost everybody given ‘the common use of air transport services’ and the long data retention period, as noted by the Court itself: a person who travels by plane only once every five years will have their data retained constantly.
Skewing the PNR Directive to make it compatible with the Charter
The judgment follows the Opinion of the Advocate General quite closely, with two notable differences.
First, PNR data for intra-EU flights cannot be collected in a general and indiscriminate manner, except in the sole situation where a Member State is confronted with a terrorist threat which is shown to be genuine and present or foreseeable. Absent such a threat, only targeted collection of PNR data is permitted for intra-EU flights. For extra-EU flights, the Court finds that collection of PNR data cannot be limited to a particular group of passengers, given the very nature of the threats from terrorism and serious crime that may stem from the carriage of passengers between a third country and the EU. This reasoning by the Court is really a blanket acceptance of the premise of the PNR Directive: everyone is a suspect until proven otherwise by database lookups and algorithmic checks. Still, the Court’s restrictions on PNR collection for intra-EU flights will have positive implications for the privacy and data protection of people travelling within the EU and exercising their right to freedom of movement. The use of PNR data for border control at internal Schengen borders is also prohibited by the judgment. The ruling on intra-EU flights will be a clear obstacle for Member States that are considering PNR data collection for trains and buses, as these modes of transport serve mostly intra-EU travel.
Secondly, the CJEU finds that retention of all PNR data for the initial period of six months is compatible with the Charter, whereas the full retention for five years must be limited (“targeted”) to passengers for whom a connection to terrorism or serious crime has been established, either before the flight or during the initial six-month period. However, the concept of targeted data retention becomes rather meaningless when the targeted group of persons can be identified through data mining analysis of every passenger’s PNR data. This is a concerning departure from previous CJEU rulings on the retention of electronic communications data, where targeted data retention must be justified by objective criteria other than data mining analysis of the personal data that could potentially be collected and retained.
Like the Advocate General Opinion, the Court finds that certain PNR elements, in particular the ‘general remarks’ field, do not meet the requirements for clarity and precision of EU law. Member States must omit these PNR elements when implementing the Directive. This is important because the ‘general remarks’ field may contain sensitive personal data, e.g. meal requests that indirectly reveal the passenger’s religious practices or political orientation. Member States must also ensure that requests to the Passenger Information Unit (PIU – the police unit in charge of receiving, processing and sharing PNR data with the competent national authorities, Europol and third countries) for access to PNR data are reviewed by a court or independent administrative authority, even during the initial six-month period, where this is not expressly provided for in the PNR Directive. Nonetheless, the Court decides that this interpretation is consistent with the intention of the EU legislature because recital 25 of the Directive includes the wording ‘to ensure the highest level of data protection’.
Trusting Member States to collect large amounts of data
On several key provisions, the Court places a disproportionate degree of trust in the Member States to apply the PNR Directive in a restrictive way that meets the requirements of the Charter. For example, the Court counts on Member States to restrict the use of the PNR surveillance system to the fight against terrorism and serious crime, although the Directive does not adequately prevent risks of abuse by investigative authorities and the use of PNR data for ordinary crime.
On the issue of comparing PNR data to “relevant” databases, the Court admits that the wording of the Directive fails to meet the criteria of clarity and precision, as it does not sufficiently specify which types of databases are eligible for cross-checking (managed by public authorities or private entities, by law enforcement or intelligence services, related to the fight against terrorism and serious crime or not, etc.), thus leaving the door open for data mining operations. However, it skews the wording of the legislation so as to trust PIUs to restrict their comparisons to a limited set of databases in order to ensure respect for fundamental rights. One of the eligibility criteria is the non-discriminatory nature of the database, that is to say, that the persons’ data are inserted ‘based on objective and non-discriminatory factors’. Considering the general hardwiring of discriminatory policing into law enforcement technological tools, and notably into police databases, it is doubtful that any database can meet this requirement.
Error-prone automated suspicion of travellers is OK – with a manual review
The Court recalls that, according to the Commission’s own impact assessment, five out of six individuals are falsely identified in the automated analysis of PNR data. This in itself should be ample evidence that the automated analysis employed is not suitable for the objective of preventing terrorism and serious crime and thus fails the necessity test. Indeed, predicting crime does not appear to work outside movie plots like ‘Minority Report’. Regrettably, the Court does not explore this angle, e.g. by establishing criteria about precision that the automated analysis must satisfy.
Instead, the Court simply notes that automated analysis necessarily presents ‘some margin of error’, and that the appropriateness of the system depends on the subsequent manual verification. If ‘some margin of error’ can mean error rates above 80%, there are no real limits on how error-prone the automated analysis employed by Member States can be, as long as there is a manual review of the automated matches. Unfortunately, the Court fails to take into account the impact of confirmation bias and the “presumption to intervene” on technology-assisted decision-making in the context of policing.
On the positive side, the Court’s insistence on effective manual review of automated matches puts considerable restrictions on the type of automated analysis that can be used. For the manual review to be effective, it must be possible for the reviewer to understand why the program has arrived at a positive match. The Court specifically notes that this requirement precludes the use of artificial intelligence (AI) technology in self-learning systems (machine learning). Such systems are liable to make the manual review redundant, and the Court goes as far as noting that they may deprive data subjects of their right to an effective remedy enshrined in Article 47 of the Charter.
On the face of it, this sets a fairly high bar for the automated analysis systems used by law enforcement authorities. In practice, it will be critical that the procedures and algorithms used by PIUs for analysis of PNR data are subjected to effective independent oversight by Data Protection Authorities. Since AI is a popular technology in the law enforcement and border control area, the implications of the judgment will extend considerably beyond the PNR Directive. Europol has recently been granted new powers for bulk collection of personal data with a view towards using data mining analysis to identify persons of interest. In the ETIAS Regulation, currently undergoing implementation by Member States, an automated analysis will be used to assess the risk of third-country visitors to the EU (with a manual review of positive matches).
Contribution by: Jesper Lund, Chairman of EDRi member IT-Pol Denmark and Chloé Berthélémy, Policy Advisor, EDRi