Banks and fintechs are collaborating to combat deepfake video and audio content generated by fraudsters. This type of fraud takes image manipulation to the next level by using Artificial Intelligence (AI) to produce fake audio and video. Face-swap videos and voice cloning are the best-known types of deepfake, and both are used to inflict harm on individuals and financial institutions.
According to a new study conducted by University College London, fake audio and video content ranks among the top 20 ways artificial intelligence can be used for crime. Meanwhile, DeepTrace Labs reported nearly 15,000 deepfake videos online last October. Subbarao Kambhampati, a professor of computer science at Arizona State University, told the New York Times, "In the longer term, I think it will be impossible to distinguish between the real pictures and the fake pictures."
HSBC, a British multinational investment bank and financial services holding company, has signed up to a biometric system developed by the technology firm Mitek, according to the Financial Times. In partnership with Adobe, HSBC will be able to verify the identity of new customers using live images and electronic signatures.
BlackBerry said that the Covid-19 pandemic has exposed people to impersonation fraud. With lockdowns reducing face-to-face contact, workers have been targeted by deepfakes asking them to authorize payments.
Back in 2019, an employee of a UK-based energy firm believed he was talking to the chief executive of his company and followed his orders to transfer $243,000 to a Hungarian supplier, according to The Wall Street Journal. In reality, he was talking to a scam artist who impersonated the CEO using a voice-altering AI tool.
On September 3, 2020, a new security center was opened by British fintech iProov. Based in Singapore, the center aims to detect and block deepfake videos used to impersonate clients. Rabobank, ING, and Aegon use this technology to ensure they are dealing with real people and not with manipulated recordings. Andrew Bud, iProov founder and CEO, said, "It's likely that so few organizations have taken such action because they're unaware of how quickly this technology is evolving. The latest deepfakes are so good they will convince most people and systems, and they're only going to become more realistic," according to Computer Weekly.
Recently, Microsoft deployed a new tool, Microsoft Video Authenticator, that can spot deepfakes manipulated by artificial intelligence, according to ETTelecom.