This time it’s patients’ biometrics — their most sensitive data of all — that appear to be at risk.
The rush by the UK’s National Health Service to embrace remote care has not only left many patients struggling to see a doctor in person; it has also opened up a rich vein of data-mining and data-management opportunities for big tech companies, including Google and Microsoft. Even the controversial US spy-tech firm Palantir, founded with support from the CIA in 2003, has shared in the spoils. But then, in late May, a scandal broke, as I reported just over a month ago in Going, Going, Gone: UK Government Speeds Up Privatisation of National Health System:
Managers at NHS Digital [had come] up with an ingenious plan to digitise and share up to 55 million patients’ private health data with just about anyone willing to pay for it. That data includes sensitive information on physical, mental and sexual health, as well as gender, ethnicity, criminal records and history of abuse. It could even include a patient’s drug or alcohol history. The NHS Digital managers kindly allowed patients to opt out of the scheme; they just didn’t bother telling them about it until three weeks before the deadline, presumably because millions of patients opting out of the scheme would have meant less money for the NHS.
When the FT finally broke the story, a scandal erupted. NHS Digital officials have since scrapped the scheme, saying they now want to focus on reaching out to patients and reassuring them their data is safe.
Unfortunately, the scandal doesn’t appear to have spurred much in the way of meaningful change at NHS Digital. This past weekend, just three months after the agency’s last data-related scandal, another one broke. Undisclosed companies, it turns out, are managing facial recognition data collected by the NHS App, which now has 16 million English users (just under 30% of the country’s population). The story, broken by the Guardian, has sparked fresh concerns about the safety of patient data in the hands of (often unidentified) private businesses:
Data security experts have previously criticised the lack of transparency around a contract with the NHS held by iProov, whose facial verification software is used to perform automated ID checks on people signing up for the NHS app.
The Guardian now understands that French company Teleperformance, which has attracted criticism in the UK over working conditions, uses an opaque chain of subcontractors to perform similar work under two contracts worth £35m.
The NHS App, not to be confused with the Covid-19 app, can be used to perform a whole host of functions including booking GP appointments and ordering repeat prescriptions. Its number of users has doubled since May, when it became the easiest means of accessing the NHS certificate proving an individual’s Covid-19 vaccination status (before the pandemic it had fewer than a million users). In the absence of an official vaccine passport in the UK, the NHS App has become the next best thing.
To access the app’s services, users must go through an ID verification process. Some are directed to an automated process powered by iProov’s software. Failing that, the NHS App resorts to manual checks, in which users record a short video of themselves reading out a set of four numbers and upload an ID document. The video is then sent to a team of identity checkers, who compare the ID photo with the user’s face in the video. But the public has no way of knowing which outsourcing companies are performing those checks, or under what terms and conditions.
The NHS also appears to be sharing the facial recognition data with law enforcement bodies, but apparently only after a special panel has analysed the formal request. An expert in surveillance law cited by The Guardian said such information was also likely to be of interest to UK and foreign intelligence services:
“If GCHQ acquired it and it was of use, the likely position is that they would share that with the [US] National Security Agency.”
As with previous scandals, the lack of transparency appears to be a feature, not a bug, of NHS Digital’s outsourcing practices. Until now most patients haven’t even been made aware that their most sensitive data is up for sale, or that it could end up being managed by deeply conflicted companies like Palantir, a firm that specialises in online surveillance and whose main line of business is providing data-science support to US military operations, mass surveillance and predictive policing. In February, Palantir’s chief operating officer told investors that the company was driving towards being “inside of every missile, inside of every drone.”
It’s easy to understand why Palantir may want to diversify into health sciences in the UK (just as it is doing in the US): there are huge amounts of money to be made. Last year alone it racked up £22 million in profits on the back of its NHS data deals. But it’s a lot harder to fathom why the UK’s National Health Service — the first health system in any Western society to offer free medical care to the entire population — would partner with a company that deals in death on such a gargantuan scale. “Their background has generally been in contracts where people are harmed, not healed,” said Cori Crider, the lawyer who co-founded the tech-justice campaign group Foxglove.
The good news is that after strong pressure from Foxglove and openDemocracy, the UK government finally relented and rescinded the NHS’s contract with Palantir earlier this month. But it’s impossible to know what will happen to all the data Palantir has managed once the contract is up. What’s more, Palantir is not completely out of the picture. Through its investment in London-based start-up Babylon Health, which provides AI-powered digital check-ups and helps users navigate the NHS system, the spy-tech giant still has its fingers in the NHS pie.
Continue reading on Naked Capitalism