AI in financial services: a turning point for regulators and firms

Commenting on the recent publication of the UK Parliament Treasury Committee's report on Artificial Intelligence in Financial Services, Joe Norburn, CEO at TCC Group (TCCMomenta and Recordsure), highlighted the growing regulatory and operational challenge posed by AI.

“The Treasury Select Committee’s latest report on AI in financial services delivers a stark warning to regulators and firms. AI is already shaping decisions across credit, insurance and customer services, yet opaque decision-making, the risk of excluding vulnerable consumers, increased fraud and unregulated advice from AI chatbots expose a widening gap in regulatory oversight. What the report exposes is a growing tension inside firms,” Norburn said.

The Treasury Committee’s report echoes this concern. It emphasises that more than three-quarters of UK financial services firms are now using AI, especially in areas such as insurance claims processing and credit assessments, and that the current “wait-and-see” regulatory stance may not be sufficient to manage the associated risks.

According to the Treasury Committee, while AI offers significant benefits, such as improved efficiency and faster customer service, the risks to consumers and market stability are real. Evidence submitted to MPs underscored worries about the lack of transparency in AI-driven decisions, the potential for financial exclusion of vulnerable customers, the rise of fraud, and risks to consumer protection.

Norburn underlines that the report exposes a growing tension inside firms: “AI is evolving quickly, often embedded deep within operational processes, while regulatory expectations remain fragmented and, in places, ambiguous.”

“A ‘wait-and-see’ approach might have once felt pragmatic, but the Committee is right to say it is no longer tenable. Without decisive action, AI-driven decision-making risks amplifying bias, weakening consumer protection and creating new sources of systemic shock. Where accountability is unclear, responsibility will inevitably fall through the gaps,” he said.

The Committee’s recommendations aim to address these concerns and represent a turning point in how AI’s role in finance is governed. The Bank of England and the Financial Conduct Authority are being told to provide comprehensive, practical guidance on how existing rules apply to AI by the end of 2026, strengthen accountability under the Senior Managers and Certification Regime, and introduce AI-specific stress testing to prepare for market disruption. Just as importantly, HM Treasury is urged to bring major AI and cloud providers into the Critical Third Parties regime to address growing concentration risk.

The report is a timely reminder that responsible AI isn’t just about innovation; it’s about governance, transparency and trust.

“The message is simple: responsible AI is no longer optional,” Norburn said. “It is fast becoming a core conduct and operational resilience concern. Innovation could and should deliver better outcomes for consumers, but only where governance, transparency and accountability are built in from the outset.”

Firms that act now to strengthen governance frameworks, clarify accountability for AI outcomes, and align with emerging regulatory expectations will be better placed to navigate the evolving landscape. As AI continues to become more embedded in financial decision making, thoughtful implementation – supported by robust oversight – will be key to realising the opportunities AI offers while protecting consumers and market integrity.

At Recordsure, we work to the highest AI principles to ensure safety, security, ethical responsibility and accountability – helping firms adopt AI with confidence in regulated environments.  

Get in touch to learn how we support financial services firms with responsible AI.

Ready to get started?

Book a demo with us to experience the power of ReviewAI in action.