The parliamentary Treasury Committee said that the Financial Conduct Authority must publish practical guidance on AI by the end of 2026. This should include guidance on how consumer protection rules apply to the use of AI and who in an organization is accountable if technology causes harm.
Regulators "are exposing the public and the financial system to potentially serious harm due to their current positions on the use of artificial intelligence in financial services," the committee wrote in its report.
The Commons committee, which scrutinizes the policy of HM Treasury and Britain's financial regulators, said that the Bank of England and the FCA must conduct stress-testing at firms over their use of AI to help protect the sector from market shocks linked to machine learning.
The cross-party group of MPs said that 75% of Britain's financial services companies are using AI, with the biggest take-up among insurers and international banks. They are using the technology to automate administrative functions and to deliver core services such as processing insurance claims and credit assessments.
Financial regulators have so far adopted a wait-and-see approach to regulating the technology, the committee said. This approach means that they are not doing enough to manage the risks that come with financial institutions becoming increasingly reliant on AI.
The U.K. does not currently have legislation to regulate AI in the financial sector: watchdogs rely on existing rules to supervise firms against risks arising from the new technology.
"The use of AI in the City has quickly become widespread," Meg Hillier, the Labour MP who chairs the committee, said. "Based on the evidence I've seen, I do not feel confident that our financial system is prepared if there was a major AI-related incident, and that is worrying.
"I want to see our public financial institutions take a more proactive approach to protecting us against that risk," Hillier added.
The committee also recommended other steps. The lawmakers said that the government should designate AI and cloud providers that work with the financial services sector under the so-called critical third parties regime. The rules were established in 2025 to give the FCA and BoE new powers to investigate non-financial companies that provide critical services to the U.K.'s financial sector.
The Treasury Committee launched an inquiry into the risks and opportunities that AI presents to the financial services sector in February 2025. It said it received 84 written submissions to its call for evidence and correspondence from six major AI and cloud providers.
--Editing by Ed Harris.