UK policing and justice bodies must be able to prove that their increasing use of public cloud infrastructure is compliant with law enforcement-specific data protection rules, says the biometrics commissioner.
As commissioner for the retention and use of biometrics, Fraser Sampson is responsible for overseeing police use of DNA and fingerprints in England, Wales and Northern Ireland. He is also responsible for monitoring the use of surveillance cameras in public spaces under his role as surveillance camera commissioner.
During an appearance before Parliament’s Joint Committee on Human Rights in February 2023, Sampson noted there was a “non-deletion culture” in UK policing when it came to the retention of biometric information.
Much of this biometric information is now either held or being moved to hyperscale public cloud infrastructure, which opens data subjects up to a number of risks.
Speaking with Computer Weekly about the use of hyperscale public cloud providers to store and process sensitive biometric data, Sampson said the “burden of proof is on police as [data] controllers, not just to provide the information and assurances, but also to demonstrate that their processing complies with all the relevant [data protection] requirements”.
He added the burden of proof was not just a matter of law, but of governance, accountability and building public trust in how the police are using new technologies.
The move to cloud
In April 2023, Computer Weekly reported that the Scottish government’s Digital Evidence Sharing Capability (DESC) service – contracted to body-worn video provider Axon for delivery and hosted on Microsoft Azure – is currently being piloted despite major data protection concerns raised by watchdogs about how the use of Azure “would not be legal”.
According to a Data Protection Impact Assessment (DPIA) by the Scottish Police Authority (SPA) – which notes the system will be processing genetic and biometric information – the risks to data subjects’ rights include US government access via the Cloud Act, which effectively gives the US government access to any data, stored anywhere, held by US corporations in the cloud; Microsoft’s use of generic rather than specific contracts; and Axon’s inability to comply with contractual clauses around data sovereignty.
There is also a concern that transferring personal data – including biometrics – to the US, a jurisdiction with demonstrably lower data protection standards, could in turn undermine people’s data rights to rectification, to erasure, and to not be subject to automated decision-making.
The SPA DPIA noted that while the risk of US government access via the Cloud Act was “unlikely”, “the fallout would be cataclysmic”.
The release of the SPA DPIA also brings into question the lawfulness of cloud deployments by policing and criminal justice bodies throughout England and Wales, as a range of other DPIAs seen by Computer Weekly do not assess the risks outlined by the SPA around US cloud providers, despite being governed by the same data protection rules.
In December 2020, for example, a Computer Weekly investigation revealed that UK police forces were unlawfully processing more than one million people’s personal data – including biometrics – on the hyperscale public cloud service Microsoft 365, after failing to comply with key contractual and processing requirements within Part Three of the Data Protection Act 2018, such as restrictions placed on international transfers.
In particular, the DPIAs disclosed to Computer Weekly via Freedom of Information requests showed that the risks of sending sensitive personal data to a US-based company, which is subject to the US government’s intrusive surveillance regime, were not properly considered.
Other uses of US cloud providers throughout the UK criminal justice sector include the integration of the Ident1 fingerprint database with Amazon Web Services (AWS) under the Police Digital Services (PDS) Xchange cloud platform; and the HM Courts and Tribunals Service’s cloud video platform, which is partly hosted on Azure and processes biometric information in the form of audio and video recordings of court proceedings.
“Past performance is a good predictor of future performance, and the police rap sheet on data protection and databases isn’t great,” said Sampson, pointing to a 2012 High Court ruling that found the retention of custody images by the Metropolitan Police to be unlawful on the basis that the information about non-convicted people was being treated in the same way as information about people who were ultimately convicted, and that the six-year retention period was disproportionate.
“We’ve still got millions of photographs, we don’t even know how many millions of photographs, unlawfully retained on a database, the excuse for which seems to be it wasn’t built with bulk deletion capability.”
While these custody images – retained in the Police National Database (PND) – are not currently held on cloud infrastructure, the Home Office has plans to move both the PND and Police National Computer (PNC) to a cloud-based platform.
Risks must be assured and mitigated
Sampson said policing and justice bodies need to be assured that the risks to people’s data rights have been mitigated to a level that is appropriate for the database or information stored on it, and that these bodies need to welcome the burden of proof they face.
“If you want the public to have trust and confidence in your kit and what you’re doing, and your contracted partners, then you have to be able to show that,” he said. “This is not just a nuisance. If people ask these questions of the police, that is not simply a nuisance, and it’s not someone trying to catch them out – it’s an elemental function of leadership and governance in policing that we don’t only respond to these challenge questions, we invite them.”
He added that policing bodies, if they have done the proper due diligence, ought to be able to answer questions about their cloud deployments “immediately and unequivocally, and should push for the opportunity to do so”.
Sampson also said that senior police officers should be able to explain to their communities what the use of cloud means for their personal data.
“Cloud is a brilliantly fluffy euphemism that doesn’t actually tell you anything about the system,” he said. “What you want to know is, ‘What country is my data being stored in and what does that mean?’. It’s really basic. And what are the risks of that then being accessed either maliciously or judicially?”
Policing and justice bodies must also be conscious of the risks that relying so heavily on certain suppliers and systems can create. “We’re creating more and more dependencies for operational policing and law enforcement on these systems, and where you create a dependency, you create risk,” said Sampson.
He added it was important for issues to be dealt with upstream during the initial procurement process, rather than after problems arise, because leaving them until later can create situations where decision-makers perceive the cost of exiting a contracted service to be too high, even when there are clear issues.
“Those strategic questions must be questions for the Home Office and Police Digital Service right at the beginning,” he said. “This is the first conversation I’ve had about any of it. No one has discussed this with our office.”
For Sampson, part of the problem is that across UK policing, there is a low level of understanding or oversight around what the systems they are deploying are capable of, and how data should be handled within them.
For example, while the Data Protection Act and the UK General Data Protection Regulation (GDPR) both came into force in May 2018, the vast majority of policing DPIAs seen by Computer Weekly do not mention the law enforcement-specific rules laid out in Part 3 of the former.
Sampson noted that in his meetings with police forces and justice bodies, he has similarly never heard mention of Part 3, or the European Union’s Law Enforcement Directive on which it is based.
“Unless you have that understanding, and you’ve assured yourself with all these challenge questions, how can you possibly assure the people whose money you used to buy it and whose very sensitive personal data you’re using on it?” he said.
“I think they need to assure themselves that they’ve identified and covered off the risks, and if they haven’t, then they need to be talking to the Home Office and the data regulator about how they address those.”
In the first annual report covering his dual function, which was delivered to home secretary Suella Braverman in November 2022 and laid before Parliament on 9 February 2023, Sampson highlighted the purchase of facial recognition technology from Chinese firm Hikvision by UK public authorities – including police – as a national security risk, given the number of sensitive sites the equipment was being installed in.
Sampson noted that while many cite China’s national security laws as a concern – on the basis that any firm headquartered there could be forced to hand over data to the Chinese government – the same is true of almost every government.
“It isn’t just the Chinese that will do it. I’m sure if you’ve got something of national security importance on your Ring doorbell, we’ll find a way of forcing you to disgorge it,” he said.
Pointing to the example of UK police drones, Sampson added when he asked the main supplier where all the captured video footage from the drones’ cameras was being stored, its use of AWS was offered as a safeguard.
“This isn’t all stored in China, this is stored on AWS, and therefore you haven’t anything to worry about. That’s the reassurance they provided,” he said. “It’s interesting because this isn’t a country of origin challenge, it’s an assurance around processes challenge. It doesn’t really matter, in some respects, where it’s housed – it’s the degree to which you’ve got some sovereignty.”
He added this issue of foreign government access would only increase in importance as UK law enforcement entities increasingly rely on citizen-captured data in investigations.
As an example, he noted that while tightly regulated DNA evidence contributes to around 1% of investigations, digital evidence – from mobile phones, home security cameras, dashcam footage, and so on – likely contributes to around 70%, but has nowhere near the same level of oversight around storage and processing.
“The more we rely on citizen-captured data, the more important that storage and access question is going to be,” he said, adding these issues will ultimately need to be resolved by the Home Office, given the use of hyperscale cloud is now a UK-wide policing issue that goes far beyond Sampson’s narrow biometrics remit.
Computer Weekly asked the Information Commissioner’s Office (ICO) about the prevalence of US cloud providers throughout the UK criminal justice sector, and whether their use is compatible with UK data protection rules, as part of its coverage of the DESC system. The ICO press office was unable to answer, and has referred Computer Weekly’s questions to the FOI team for further responses.
Computer Weekly also contacted the Home Office, but received no response.