The potentially illegal two-year deal, which began on December 12, was awarded under the Crown Commercial Service’s G-Cloud 11 Framework, a ‘streamlined’ – i.e. much-accelerated – system typically used for minor contracts, which doesn’t require a tender to be published. Under the terms of the agreement, the store will rely on Palantir’s Foundry data until at least December 2022.
The store was established in March to manage Covid-19 data and inform the government’s response to the virus. Palantir was one of several tech firms hired for the project, for the princely sum of £1, and ever since has been granted access to sensitive data such as patients’ ages, addresses, health conditions, treatments and whether they smoke or drink, among other private information.
Such information is clearly a highly valuable commodity, exclusive oversight of which is fraught with opportunity for abuse, but Whitehall insists any personally identifying details are aggregated or anonymized prior to being shared with Palantir et al.
Despite these reassurances, the deal did not go unchallenged – not least because the contracts underpinning the deal weren’t published until June, mere hours before a legal action brought by political campaigning website openDemocracy and law firm Foxglove to secure their release was due to commence.
A very secret weapon
Even a cursory review of Palantir’s operations starkly underlines why so many should be concerned about its involvement with the NHS, and its access to such intimate patient particulars.
Co-founder Peter Thiel’s inspiration for the company was a desire to repurpose the fraud recognition systems of PayPal – which he also co-founded – for defense and security applications. Most established investors weren’t interested in his pitch, but it caught the attention of In-Q-Tel, the little-known venture capital wing of the US Central Intelligence Agency (CIA), which gifted the start-up US$2 million in 2004.
A decade later, Palantir was valued in the billions and pulling in hundreds of millions of dollars annually, primarily from US government agencies, including the CIA, Departments of Defense and Homeland Security, Federal Bureau of Investigation, Air Force, Marine Corps, and Special Operations Command.
The company’s ‘Gotham’ platform pools these entities’ disparate and sprawling databases and allows for the effective sorting, management and cross-referencing of information contained therein.
These capabilities can be put to predictive purposes – for example, soldiers in Afghanistan used the tool to combine maps, intelligence reports, and reports of roadside bombings to plan missions. As a result, Bloomberg dubbed Palantir the War on Terror’s “secret weapon,” and Gotham also became of intense interest to law enforcement agencies the world over.
Due to the veil of secrecy surrounding Palantir’s commercial activities, the total number of forces employing the technology globally is unknown – the company is not only opaque about what services it provides to which clients, but has been outright dishonest about the nature of its government partnerships in the past.
In 2018, it was revealed police in New Orleans utilized Palantir software to trace targets’ ties to gang members, link suspects’ criminal histories, analyze social media, and forecast the likelihood individuals would commit crimes or become a victim thereof.
Several high-profile convictions of violent, murderous drug gangs were secured in the process – although the program was operated in total secrecy for five years until it was exposed by tech website The Verge, with even the city council totally unaware of its existence.
The tendency toward concealment may at least in part be attributable to public outcry over predictive policing programs, which exhibit seemingly invariable racial bias. Even algorithms that do not specifically use race as a metric have been found prone to this prejudice, due to the inclusion of ancillary variables such as socio-economic background, education, and location.
Still, significant light was shed on how authorities use Gotham, and what information it collates, in September 2020, when two Los Angeles Police Department training documents – ‘Intermediate Course’ and ‘Advance Course’ – used to instruct officers on the workings of the system were leaked.
The data collected on citizens – law-abiding residents as well as those with criminal records, those suspected of having committed a crime, and even people connected in any way to such individuals – includes sex, race, names, contact details, addresses, prior warrants, mugshots, surveillance photos, personal relationships, past and current employers, and even tattoos, scars, piercings and other identifying features.
According to an ‘LAPD Palantir Usage Metrics’ document, in excess of 5,000 officers – accounting for half the Department’s members – had access to Gotham in 2016, and in that year, they collectively ran around 60,000 searches through the system in support of over 10,000 cases.
Such a cutting-edge service doesn’t come cheap, with subscriptions running to millions of dollars annually. This sizable investment comes despite questions hanging over Palantir’s predictive policing effectiveness, which may suggest the software’s value to authorities doesn’t necessarily lie in its crime-fighting prowess.
Official figures indicate violent crime rates remained virtually unchanged in Los Angeles from 2009 to 2019, while aggravated assaults increased. However, non-violent crime did fall over the same period, in particular burglary and vehicle theft.
‘Without any safeguards’
Troublingly, crucial portions of Palantir’s NHS contract are entirely redacted, including sections titled “limit of parties’ liability,” “authorised user groups,” and “data integration and analytics capability for self-service” – which covers how many “authorised users” are permitted to create and modify tools designed using the data, and the data sets involved.
In other words, the public presently has no way of knowing precisely what private information Palantir has been granted access to, how it will be used, or with whom it can be shared.
Shocking stuff indeed, yet the mainstream UK media and lawmakers alike have been almost entirely silent on what should at the very least be the subject of intense national debate. Amazingly, the company’s name has been mentioned a grand total of twice in parliamentary debates over the course of 2020, and only once in a critical context.
This wall of establishment silence stands in stark contrast to the US, where legislators – including Senator Elizabeth Warren – have prominently raised privacy concerns over Palantir’s participation in ‘HHS Protect’, a program launched by the Department of Health and Human Services to track the spread of coronavirus. Under its auspices, the company harvests data from a variety of federal, state and local government sources, healthcare facilities, colleges, and more.
“The inclusion of Protected Health Information in this database raises serious privacy concerns,” a coalition of Democratic senators wrote in July. “Neither HHS nor Palantir has publicly detailed what it plans to do with this, or what privacy safeguards have been put in place, if any. We are concerned that, without any safeguards, data in HHS Protect could be used by other federal agencies in unexpected, unregulated, and potentially harmful ways.”
While ministers claim life in the UK will begin returning to “normal” around Easter 2021, the length of the deal places Palantir in a position of immense and entirely unaccountable privilege at the heart of an institution which theoretically provides vital services to every British citizen, for the next two years and potentially beyond.
Readers may wish to ask themselves who or what their elected representatives are truly working for, and which interests they ultimately serve – or better yet, pose these queries to parliamentarians directly.
The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.