Fundamental Rights Agency report: The risks from biometrics and EU IT systems
On 27 March 2018, the Fundamental Rights Agency (FRA) published a report entitled “Under watchful eyes: biometrics, EU IT systems and fundamental rights”. The report analyses the impact of technologies used for immigration and security purposes on the rights to privacy and data protection.
Currently, three large-scale IT systems relying on biometric data exist in the EU: an asylum system, a migration-management system and an internal security system. In addition, there are advanced plans to set up three new systems for security and border management purposes. Furthermore, the European Commission has recently tabled legislative proposals to make these systems interoperable by creating a common search portal and by establishing a common repository with core biographic data of individuals whose data is stored in the different IT systems. The FRA report specifically analyses how large-scale IT systems and interoperability impact fundamental rights enshrined in the Charter of Fundamental Rights of the European Union.
The FRA argues that the use of large-scale IT systems presents both risks and opportunities for fundamental rights. On the one hand, these systems allow for more robust and timely protection in the field of immigration and security, for instance by preventing identity fraud and theft. On the other hand, since individuals whose data are stored in large-scale IT systems are often in a weak position (such as migrants or asylum seekers), these systems also pose many risks to fundamental rights.
First, these systems do not always provide information in an understandable and transparent manner. According to FRA researchers, individuals are not always fully informed about their data being processed, and even when they are, they face difficulties understanding how it is processed. To address this, the FRA has issued three opinions emphasising the need to provide information that covers “all purposes of the data processing and must include information on the subjects rights, in an understandable and transparent manner”.
Second, industry needs to ensure the protection of fundamental rights when designing new solutions. As the state of the art of technology determines the options that the EU and its Member States have when creating new systems, industry and scientific research play a major role when coming up with technical solutions. Therefore, according to FRA, experts on personal data protection should be involved when designing new solutions, to make sure the principles of data protection are embedded in these technologies by design and by default.
Third, these systems need to have strong safeguards to prevent unlawful access to data. The use of IT systems to control irregular migration and to fight crime and terrorism raises the risk of data being used for purposes not initially envisaged. The risk is even higher with interoperability between IT systems. Therefore, when personal data is being processed, the purposes of the processing have to be specified and explicitly defined. In FRA’s opinion, safeguards should be put in place to ensure lawful access to data stored in IT systems in the field of asylum and migration.
Fourth, sharing personal data with third (i.e. non-EU) countries can present risks for persons in need of international protection. In its opinion, the FRA suggests that Member States must “take measures to prevent information that a third-country national has lodged a claim for international protection from being shared with third countries”.
Fifth, a careful evaluation needs to be done of the impact that law enforcement authorities’ access to data stored in IT systems may have on fundamental rights. Most EU IT systems contain data about individuals who are not suspected of having committed any crime. Therefore, the FRA suggests that the EU and its Member States should carefully assess this impact in the field of immigration. The EU legislators should make sure that the “cascade system” safeguard, which obliges Member States to first consult national databases linked to criminal investigations before consulting EU IT systems, is retained.
Sixth, data quality needs to be ensured. Mistakes in the IT systems used in the field of asylum and migration management can have damaging consequences for individuals, and FRA research identified some inaccuracies. Therefore, in order to improve data quality, the FRA suggests that the Council of the EU should “continue to put data quality issues on the agenda to promote the implementation of best practices”. The FRA also suggests that EU Member States should involve the persons whose data are collected and used in verification procedures.
Finally, the rights of individuals need to be ensured. This means ensuring the effective exercise of the rights of access, correction and deletion of personal data. Even though data quality issues are recurrent, complaints about incorrect or unlawful data use are rare. This demonstrates a lack of awareness and understanding of how to exercise the right of access, correction or deletion of inaccurate stored data. Furthermore, this may be exacerbated if IT systems are made interoperable. In this respect, the FRA suggests that EU Member States should raise people’s awareness of their right to access their personal data, and that simplified procedures to access, correct and delete personal data should be put in place.
Under watchful eyes – biometrics, EU IT-systems and fundamental rights (28.03.2018)
EU wastes no time welcoming prospect of Big Brother databases (15.05.2017)
Norway introduces forced biometric authentication (26.07.201)
Swedish border control becomes a privacy nightmare for travellers (13.01.2016)
(Contribution by Margaux Rundstadler, EDRi intern)