In 2026, Britain decided to expand police facial recognition nationwide after selecting Corsight AI, Israeli surveillance software reportedly used in Gaza, while deepening its contracts with Palantir. The moves have raised questions about civil liberties, racial bias, data sovereignty and how battlefield technologies migrate into policing.
The Home Office’s decision to dramatically scale live facial recognition (LFR) marks a turning point in Britain’s AI surveillance architecture.
UK police surveillance fleets are set to grow from 10 to more than 50 vans nationwide, powered by AI supplied by Digital Barriers and its Israeli subcontractor, Corsight AI. Campaigners warn that the move imports tools tested in war zones into everyday public life.
From Gaza Checkpoints to British Streets
Corsight’s software has been linked to Israeli military surveillance in Gaza, where it was reportedly deployed by Unit 8200 to identify Palestinians at checkpoints. Israeli officials later raised concerns about its accuracy after “hundreds” were wrongfully arrested and detained, according to reporting cited by the New York Times.
Despite this record, the Home Office selected Digital Barriers and Corsight following a six-month Essex Police trial, as part of a £20 million rollout. Civil liberties groups fear misidentifications could disproportionately affect ethnic minorities already subject to heightened policing in Britain.
Essex police previously refused to confirm whether officers met Corsight representatives, saying the request would exceed cost and time limits, according to Action on Armed Violence (AOAV). Corsight’s leadership includes former Israeli intelligence and security officials, deepening concerns about how battlefield technologies are repurposed for civilian use.
The close UK–Israel tech relationship is built on moving surveillance tools from conflict zones into domestic settings. Technologies used on Palestinians, a population subjected to intense, racialized surveillance, risk carrying embedded biases into UK policing.
In a diverse society like the UK, such systems could amplify discriminatory outcomes against Black, Arab, Muslim and other minority communities, normalizing algorithmic suspicion under the guise of security.
Palantir, Power and Data Lock-in
Alongside the facial recognition rollout, the country has tightened its embrace of Palantir, awarding the US firm a $331 million (£240 million) Ministry of Defence contract and embedding its software across National Health Service (NHS) data systems.
Palantir has pledged $2.07 billion (£1.5 billion) to “boost military AI” and “transform lethality in the battlefield,” language that has unsettled critics in the UK.
Medical and civil society groups argue Palantir’s involvement undermines trust. The British Medical Association (BMA) passed a 2025 resolution calling the company an “unacceptable choice of partner” for the NHS.
BMA deputy chair Dr David Wrigley said: “If Palantir’s software is being used to target individuals in immigration enforcement and is being deployed in active conflict zones, then that’s completely incompatible with the values we uphold in the delivery of care.”
David Nicholl of Doctors’ Association UK (DAUK) told Middle East Eye that “one of DAUK’s key missions is patient safety, and the involvement of Palantir in the NHS is having an adverse effect on patient trust in how their data is handled”.
Rachael Maskell, the Labour MP for York Central, told The Guardian in December 2025 that the government “needs to undertake transparent due diligence,” while Clive Lewis argued the state should “stay very far away” from Palantir. Swiss military experts previously rejected the firm over fears that US intelligence could access sensitive data, a risk British officials insist is mitigated by contractual controls.
Palantir’s UK chief, Louis Mosley, rejects claims of lock-in or ethical conflict, accusing critics of putting “ideology over patient interest” and arguing its tools are essential amid rising security threats.
Yet critics counter that dependence itself is the danger.
“Every win for Palantir is a loss for British sovereignty,” said Marc Warner, CEO of British AI firm Faculty, warning that once such software is embedded, withdrawal becomes costly and disruptive.
As Britain accelerates its adoption of powerful Israeli surveillance technology, the central question remains unresolved: whether AI tools used in war and border control can be safely repurposed in a democracy, or whether they will quietly reshape policing, racial bias, healthcare and civil rights along far harsher lines.