Non-fitted devices in the UK Home Office’s surveillance arsenal: Investigating the technology behind GPS fingerprint scanners

Privacy International’s technical research on the so-called non-fitted devices (NFDs) used by the UK Home Office to track migrants shows that these devices are intrusive and stigmatising by design. NFDs extend the UK’s surveillance of migrants on immigration bail and subject to electronic monitoring conditions beyond the GPS ankle tags already in use.

By Privacy International (guest author) · November 20, 2024

The UK Home Office’s Electronic Monitoring Programme

As GPS tagging expands under its electronic monitoring programme, the UK Home Office has increasingly deployed non-fitted devices (NFDs) that track a person’s GPS location and request frequent biometric verification in the form of fingerprint scans.

The NFDs deployed by the UK Home Office are small handheld devices with a fingerprint scanner that record a person’s location 24/7 (referred to as their trail data). Most recently, migrants have been issued Android mobile phones with a fingerprint scanner on the back and the same 24/7 GPS tracking technology. The device alerts the person at random intervals throughout the day to request a fingerprint scan, which it then compares against a representation of the biometric information stored on the device. This is to verify that the person has the device with them at all times for the purposes of tracking.
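
The exact implementation on the Home Office’s devices is not public. The short Python sketch below is purely illustrative, with invented function names, thresholds and intervals, but it captures the behaviour described above: continuous location logging paired with fingerprint-verification alerts that can fire at any random moment of the day.

    import random

    # Illustrative simulation of the reported NFD behaviour. All names, thresholds and
    # intervals here are assumptions for the sketch, not details of the actual devices.

    MATCH_THRESHOLD = 0.90      # hypothetical pass/fail score for a fingerprint comparison
    ALERTS_PER_DAY = 5          # alerts reported to occur up to five times per day
    DAY_SECONDS = 24 * 3600

    trail_log = []              # continuous location history ("trail data")
    alert_log = []              # outcome of each fingerprint verification alert

    def current_location():
        """Stand-in for a GPS fix; a real device would query its GNSS receiver."""
        return (51.5 + random.uniform(-0.01, 0.01), -0.12 + random.uniform(-0.01, 0.01))

    def fingerprint_match_score():
        """Stand-in for comparing a fresh scan with the template stored on the device."""
        return random.uniform(0.7, 1.0)

    def simulate_day():
        # Location is recorded around the clock (here, once per minute).
        for t in range(0, DAY_SECONDS, 60):
            trail_log.append((t, current_location()))
        # Alerts fire at random moments across the full 24 hours, including at night
        # or while the person is in public.
        for t in sorted(random.uniform(0, DAY_SECONDS) for _ in range(ALERTS_PER_DAY)):
            alert_log.append((t, fingerprint_match_score() >= MATCH_THRESHOLD))

    simulate_day()
    print(f"{len(trail_log)} location points logged, {len(alert_log)} fingerprint alerts raised")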

PI’s previous research on NFDs found that these alerts are sent to the person carrying the device up to five times per day, although, as set out below, there are reports of individuals receiving many more alerts than this in a given day. Those subjected to these devices have cited detrimental impacts on their daily life and their mental and physical wellbeing, due to the devices’ pervasive and erratic nature.

Research found this tech to be inhumane by design

PI’s research examined several features of NFDs that individuals have reported as harmful and detrimental to their wellbeing, and the extent to which these harms are exacerbated by the devices’ technical design:

  • Unreasonable and unforeseeable alert time periods spanning 12 hours or more (e.g., in the middle of the night or when visiting the shops or a place of worship), resulting in social stigma and anxiety about being constantly surveilled
  • Unreasonable alert response times (i.e., too little time to provide one’s fingerprint), which can unfairly count against the individual
  • Unreasonable or overly strict fingerprint match thresholds (e.g., requiring unrealistically high match scores, which drives up false rejections; any threshold trades false acceptances against false rejections, as the sketch after this list illustrates)
  • A high volume of alerts that can go off at any time (including in public), which can worsen feelings of social stigma and anxiety at being monitored at all times
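
To make the third point concrete: a fingerprint comparison does not return a simple yes or no but a similarity score, and the device’s pass/fail threshold determines how that score is interpreted. The Python sketch below uses invented score distributions, purely for illustration, to show the trade-off an unreasonably strict threshold creates: false acceptances become rarer, but the genuine user is rejected far more often.

    import random

    # Illustrative only: simulated match scores showing how the pass/fail threshold trades
    # false acceptances against false rejections. The score distributions are invented.

    random.seed(1)
    genuine = [random.gauss(0.85, 0.07) for _ in range(10_000)]   # the right person scanning
    impostor = [random.gauss(0.40, 0.10) for _ in range(10_000)]  # someone else scanning

    def error_rates(threshold):
        false_reject = sum(score < threshold for score in genuine) / len(genuine)
        false_accept = sum(score >= threshold for score in impostor) / len(impostor)
        return false_reject, false_accept

    for threshold in (0.70, 0.95):
        frr, far = error_rates(threshold)
        print(f"threshold {threshold:.2f}: false rejections {frr:.1%}, false acceptances {far:.1%}")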

PI’s tech researchers wrote pseudocode, and built a mockup app from it, to demonstrate that NFD technology can be configured to be far less intrusive than the features reported above. The pseudocode could, for instance, confine alerts to reasonable time periods rather than windows spanning 12 hours or more, and it allowed alert response times to be customised so that people have reasonably longer to respond.

The software development kit (SDK) of the test device also allowed the pass/fail threshold for fingerprint match scores to be set at a more reasonable level, and the researchers were able to reduce the volume of alerts to something far more proportionate, such as just once a day. They were also able to show that 24/7 GPS tracking is not necessary for the purposes of monitoring: instead of constantly tracking a person’s location in the background, the pseudocode implementation logs location data only when the person scans their fingerprint. This could allow for much more limited locational and biometric data collection – such as once a week during a scheduled check-in.
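
PI’s actual pseudocode is not reproduced here. The Python sketch below, with assumed parameter names and values, illustrates the kind of configuration that research describes: a narrow daytime alert window, a longer response time, an adjustable match threshold set through the device SDK, and location recorded only at the moment of a scheduled check-in rather than 24/7.

    from dataclasses import dataclass
    import datetime

    # A minimal sketch of a less intrusive configuration; all parameter names and
    # defaults are assumptions, not PI's pseudocode or the Home Office's settings.

    @dataclass
    class MonitoringConfig:
        alert_window: tuple = (datetime.time(10, 0), datetime.time(18, 0))  # no night-time alerts
        response_time: datetime.timedelta = datetime.timedelta(minutes=30)  # generous time to respond
        match_threshold: float = 0.80        # pass/fail level, adjustable via the device SDK
        checkins_per_week: int = 1           # e.g. a single scheduled weekly check-in

    @dataclass
    class CheckInRecord:
        timestamp: datetime.datetime
        location: tuple
        passed: bool

    log = []  # the only locational/biometric data retained: one record per check-in

    def scheduled_check_in(config, now, location, match_score, responded_after):
        """Run one scheduled check-in; location is captured only at this moment."""
        if not (config.alert_window[0] <= now.time() <= config.alert_window[1]):
            return None  # never raise an alert outside the agreed daytime window
        on_time = responded_after <= config.response_time
        record = CheckInRecord(now, location, on_time and match_score >= config.match_threshold)
        log.append(record)
        return record

    cfg = MonitoringConfig()
    scheduled_check_in(cfg, datetime.datetime(2024, 11, 20, 11, 0), (51.5, -0.12),
                       match_score=0.91, responded_after=datetime.timedelta(minutes=5))
    print(f"{len(log)} record(s) stored this week instead of a continuous 24/7 trail")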

What the pseudocode shows is that GPS fingerprint scanners, already a highly intrusive surveillance technology, are designed in a particularly disproportionate and arbitrary way that fails to respect the basic right to dignity of those subjected to them. The Home Office appears, characteristically, to be deploying intrusive technology without adequate consideration of its human rights impacts on those subjected to monitoring.

Moving forward: ensuring human dignity and the right to privacy

Given what has been reported about NFDs and what the research above shows is technically possible, the UK Home Office has either not considered, or has disregarded, the possibility of configuring fingerprint and location tracking in less intrusive ways. More widely, this is indicative of an immigration system built on data exploitation and surveillance.

The human cost of these kinds of migration surveillance programmes is borne by individuals who are made to feel like ‘caged animals’, constantly wondering when the next alert might go off – whether they are on a bus, seeing friends, or trying to sleep.

The Home Office should consider more humane approaches to its use of new technologies, to ensure that it does not further dehumanise migrants. It should instead take a deliberate approach to protecting and respecting migrants’ right to live in dignity, free from stigma and arbitrary surveillance, as they navigate an increasingly hostile environment.

In light of these research findings, PI will continue to investigate abuses carried out through NFDs and the UK Home Office’s wider electronic monitoring arsenal, and to monitor this and other migration surveillance technologies. Recently, for instance, PI received a FOIA response regarding IPIC (“Identify and Prioritise Immigration Cases”), the UK Home Office’s secretive AI tool that automatically identifies and recommends migrants for enforcement action. Further updates and disclosures on algorithmic decision-making deployed by the Home Office are expected, and PI will continue to report on this space and to scrutinise and challenge these migration surveillance tools.

Contribution by: EDRi member, Privacy International