
Trust is a two-way street: the UK’s digital identity framework

Trust is a two-way street, and while the British government’s digital identity trust framework takes steps in the right direction, those efforts need to be accompanied by a commitment to transparency and integrity, says EDRi member Open Rights Group in response to the government’s policy paper on the UK’s digital identity and attributes trust framework.

By Open Rights Group (ORG) (guest author) · March 24, 2021


Open Rights Group (ORG) has recently responded to the UK Department for Digital, Culture, Media and Sport (DCMS) policy paper on the UK’s digital identity and attributes trust framework. This consultation followed the 2019 call for evidence on digital identity, to which ORG also responded, and builds on the government’s direct and ongoing engagement with civil society groups, including ORG, on digital identity issues.

In their response, ORG commended the government for the progress made so far, in particular its commitments to:

  • Establish a governance and oversight function over the trust framework;
  • Not use the trust framework to build an identity card system;
  • Build the system with interoperability standards between schemes, sectors, and geographies in mind;
  • Restrict the use of personal data;
  • Ensure that data is not used to profile users of any age for marketing purposes, create aggregated data sets which could be misused or reveal sensitive personal data; and
  • Not process digital identities, or attributes, without the person’s consent.

However, the policy paper raises some issues which ORG felt would require further discussion and clarification.

Safeguards or blockers?

The policy paper, like the previous call for evidence, refers to “developing proposals to remove legislative and regulatory blockers to the use of secure digital identities and establish safeguards for citizens”. But what does that mean in practice? We still don’t know. Characterising the legal safeguards around our data as obstacles, without any clarity about what removing them would involve, risks undermining the protections which keep our data safe from abuse.

ORG’s position is clear: any changes to laws and regulation, whether they involve the data protection laws which restrict the sharing of personal information, or the government standards for sharing that data, must be brought forward for transparent public debate.

Likewise, there is little discussion of user needs, or of the need to build the system from a person-centred approach, beyond the standards of inclusivity required by the Equality Act. A more considered approach to building a system that creates user trust, and makes accountability accessible to the many, is key for this framework.

Standards and interoperability

Interoperability is central to the policy paper. In looking at how best to build this system, ORG want the government to consider past failures, whose lessons do not seem to have been taken into account in the trust framework.

For example, the framework continues to cite GPG (Good Practice Guide) standards. These standards have, in the past, contributed to individuals failing to meet the necessary thresholds for accessing Universal Credit. In other words, GPG standards create barriers to entry for both users and providers. If these standards are retained as mandatory, or a system is built which makes them mandatory, those systematic exclusions will continue. Relying on standards which can exclude the very individuals a system is built for, or excluding providers from offering a service because they are incompatible with ageing standards, risks repeating the mistakes of the past.

Likewise, the absence of dialogue with existing standards, such as the Privacy and Consumer Advisory Group principles or the ID4D standards, is concerning. If the UK’s trust framework makes too much of an attempt to go it alone, interoperability could become all the harder to create.

ORG want to see high legal and technical standards deployed to protect user privacy. But the balance must tip in favour of individuals, not institutions. Standards must be flexible but not so fragile as to allow bad-faith actors to exploit digital identity systems. Standards must also be high enough to ensure confidence in the stated identity, but not so high as to create an impossible barrier for those who – through no fault of their own – lack a strong identity footprint.

Fraud prevention

The trust framework sets out a raft of counter-fraud and security measures. In ORG’s view, these measures are satisfactory for those using the trust framework to detect individuals committing identity fraud. But ORG are concerned that the framework fails to provide information on how providers should combat fraud committed against individuals by those pretending to be affiliated with institutions.

In any identity framework, if trust only flows one way – the individual proving who they say they are to the institution – there is a trust deficit and a risk of fraud against the individual. What about the institution proving who it says it is to the individual? Counter-fraud measures need to flow both ways, and ORG would like to see the digital identity strategy expand in that area.

Trust is a two-way street, and while the government’s digital identity trust framework takes steps in the right direction, those efforts need to be accompanied by a commitment to transparency and integrity. ORG look forward to continuing to engage with government on these difficult issues.

The article was first published here.

(Contribution by: Matthew Rice & Heather Burns)