
FCA chief cautions banks on the risks of AI and big tech

Thu 13 Jul 2023

Banks have been warned about the risks of AI and big tech by the chief of the Financial Conduct Authority (FCA). Nikhil Rathi highlighted the need to invest in solutions that protect against the hazards of AI and ‘deepfake’ fraud.

In the speech, Rathi urged the financial sector to become more productive, while warning of the risks surrounding automated trading algorithms in financial markets and the poor outcomes that can result from biased datasets.

The FCA chief acknowledged the potential of AI, citing a study by the US National Bureau of Economic Research, which found that productivity was boosted by 14% when over 5,000 customer support agents used an AI conversational tool. However, Rathi raised concerns regarding its increasing global use.

“The use of AI can both benefit markets and can also cause imbalances and risks that affect the integrity, price discovery and transparency and fairness of markets if unleashed unfettered,” said Rathi.

Additional AI risks include the manipulation of audio and video to mislead consumers. Rathi cited a ‘deepfake’ video of personal finance campaigner Martin Lewis as an example of this technology.

Bosses have also been warned that they will be held accountable for AI and the decisions taken at their firms.

“As AI is further adopted, the investment in fraud prevention and operational and cyber resilience will have to accelerate at the same time. We will take a robust line on this – full support for beneficial innovation alongside proportionate protections,” added Rathi.

The FCA chief’s words echo ambitions of the UK Government to become a global AI regulation hub.

“The Prime Minister said he wants to make the UK the home of global AI safety regulation … we stand ready to make this a reality for financial services,” said Rathi.

Big tech dangers

Rathi called for further input on the role of ‘big tech’ firms as gatekeepers of data, as well as on the risks they may pose to operational resilience in payments, retail services, and financial infrastructure.

“We are mindful of the risk that big tech could pose in manipulating consumer behavioural biases,” said the FCA chief.

Rathi highlighted the partnership opportunities that big tech can offer, but noted that as the power of larger firms becomes more entrenched, concerns have been raised about the risks to normal financial market functioning.

Rathi stressed that the data held by financial services firms cannot match the data sets owned by big tech firms, which combine browsing data, biometrics, and social media with anonymised financial transaction data.

Responsibility and regulations

The FCA’s work with the Bank of England and the Prudential Regulation Authority (PRA) was also highlighted in the speech. With nearly two-thirds of UK firms relying on the same few cloud service providers, accountability and regulation were recognised as two vital components in setting standards for these services.

“We must be clear where responsibility lies when things go wrong … we will therefore be regulating these Critical Third Parties, setting standards for their services, including AI services, to the UK financial sector. That also means making sure they meet those standards and ensuring resilience,” said Rathi.

The existing Senior Managers & Certification Regime and the Consumer Duty, due to come into force this month, are examples of measures established by the FCA to tackle these concerns.

Rathi said the regime provides clear processes to follow in response to AI innovations, while the Consumer Duty sets higher and clearer standards for consumer protection.


