UK’s House of Lords questions experts over AI
The UK’s House of Lords announced in May a plan to turn its attention to technology’s role in modern policing and law enforcement, seeking guidance from legal experts to familiarize itself with the topic.
Academics and senior officers met with the House of Lords, where members of the UK’s upper chamber of parliament sought answers concerning Artificial Intelligence (AI) and Machine Learning (ML) in facial recognition.
Back in May, the Justice and Home Affairs Committee revealed that it would conduct its first round of oral evidence in June with legal experts, in order to acquaint itself with this advancing technology.
On Wednesday, Professor Michael Wooldridge – Head of Computer Science at the University of Oxford – was interviewed by peers.
Professor Wooldridge echoed some of the concerns peers had raised, such as the now-standard case in which a computer system advises custody officers whether to detain or release someone based purely on data, including their criminal history.
Wooldridge went on to explain that, as humans, we still need to expand our knowledge to fully understand how AI works, and what it is and isn’t capable of, before we take any steps towards humanizing it.
“It’s not like human intelligence,” he said.
His argument was that when AI is implemented in government systems, there is a high chance it will misidentify photos, leading to momentous consequences in the criminal justice system, particularly in facial recognition.
As members of the Lords began hearing oral evidence and arguments for and against AI and ML, Carole McCartney – Professor of Law and Criminal Justice – also voiced her fears, explaining that “one of the big criticisms of technologies is their lack of scientific basis.”
All of this reflects the rapid expansion AI is undergoing, with governments considering and even implementing the technology in their systems without conducting thorough research into the matter and the consequences it could lead to.
The meeting with experts does not come as a surprise, since members of the UK parliament had shown prior interest after London’s Metropolitan Police announced in a tweet its decision not to adopt facial recognition across the city.
For example, in December of last year, tech experts described facial recognition as a severely racist and gender-biased technology, leading to its removal from the London Met and other law enforcement stations across London.
In the U.S., the role of this technology was highlighted and investigated after the House Committee on the Judiciary heard evidence about how it is used by law enforcement agencies amid a lack of regulation.
Even though this technology has been met with rejection or opposition from some of the most prominent legal experts, facial recognition remains widely used in some of the UK’s biggest government sectors.
It has been used at airport ePassport border security gates since 2008, and it may eventually replace check-in and passport checks at Heathrow Airport.