
What is Artificial Intelligence?

A recent Financial Conduct Authority (FCA) discussion paper, DP22/4: Artificial Intelligence, offered the following definition of Artificial Intelligence (AI):

‘It is generally accepted that AI is the simulation of human intelligence by machines, including the use of computer systems, which have the ability to perform tasks that demonstrate learning, decision-making, problem solving, and other tasks which previously required human intelligence. Machine learning is a sub-branch of AI.

AI, a branch of computer science, is complex and evolving in terms of its precise definition. It is broadly seen as part of a spectrum of computational and mathematical methodologies that include innovative data analytics and data modelling techniques.’

Many PIMFA members are already using AI systems and tools in their everyday operations, and it is likely that the adoption of AI in financial services will increase rapidly over the next few years.

PIMFA firms already use AI tools for several purposes: for example, to analyse large volumes of data quickly, easily and accurately, which enables their employees to spend more time working with and for their clients.

There are concerns that, as AI becomes more advanced, it could introduce new risks: for example, a system could develop biases in its decision making, leading firms to make bad decisions and resulting in poor outcomes for their clients.

This is why it is essential that firms deploying AI systems have a suitably robust control framework around their AI components, keeping a careful check on what those systems are doing.

As with any innovation, AI has the potential to make fundamental and far-reaching improvements in how firms serve their clients. However, it must be monitored and checked regularly to manage the risks and maximise the benefits.

In line with the White Paper, a number of government departments are asking regulators such as the Financial Conduct Authority (FCA), the Bank of England (BoE), the Information Commissioner's Office (ICO) and the Competition and Markets Authority (CMA) to publish an update on their strategic approach to AI and the steps they are taking. The Secretary of State has asked for this update by 30 April 2024.

On 13 March 2024, the EU Parliament approved the EU Artificial Intelligence Act. The EU AI Act sets out a comprehensive legal framework governing AI, establishing EU-wide rules on data quality, transparency, human oversight and accountability. It features some challenging requirements, has broad extraterritorial effect and provides for potentially substantial fines for non-compliance.

Financial Services Skills Commission call for evidence: The impact of AI and disruptive technology

The Financial Services Skills Commission (FSSC) has published a call for evidence (CfE) on how AI and disruptive technology will change financial services, as well as its workforce, over the next five to ten years.

The FSSC will analyse how skills can drive growth and productivity by supporting AI adoption and innovation in financial services, and will make recommendations on this basis.

The CfE closes on 26 November 2025 and the research phase of this project will include publication of an interim report in 2026.

Read more details here and access the CfE here.

DSIT Call for Evidence: AI Growth Lab

The Department for Science, Innovation & Technology (DSIT) has launched a call for evidence on the proposed AI Growth Lab, designed to support responsible AI innovation through targeted regulatory modifications under robust safeguards. The Lab would provide a cross‑economy sandbox to enable deployment of AI‑enabled products and services that current regulation may restrict.

Working with sectoral regulators, the Lab could run issue‑specific sandboxes, grant time‑limited exemptions for eligible firms, and make recommendations on converting successful pilots into regulatory reforms. DSIT is seeking views on two potential models – a single, centrally operated lab or regulator‑led labs. 

Responses are due by 2 January 2026. For more information, please click here.

The AI Advantage: How to Revolutionise Business Growth with Reverification

Read here an article from the PIMFA Journal #32 by Alexander Blayney, Global Partnerships & Enterprise Sales at Id-pal, looking at how AI, applied to identity reverification, can help revolutionise business growth amid the increasing prevalence of identity fraud.

HM Treasury: Artificial Intelligence and Cybersecurity – Navigating Risk and Resilience in the Financial System

HM Treasury has published a statement on Artificial Intelligence and Cybersecurity by the G7 Cyber Expert Group (CEG), which advises on cybersecurity policy issues and proactively addresses the emerging and evolving cybersecurity risks AI may pose.

Key areas covered in the CEG statement include:

  • Illustrating the Cyber Impact of AI
  • Maximising Opportunities While Managing Risks
  • Financial Sector Considerations
  • Key Considerations for Financial Institutions and Authorities

Read the statement here.

PIMFA Compliance Conference 2025

Kevin Sloane, Senior Policy Advisor, PIMFA, speaking about Leveraging Artificial Intelligence to meet your compliance challenges at the 2025 PIMFA Compliance Conference, with his panel members Marina Reason, Partner, Financial Services Regulation, Herbert Smith Freehills Kramer LLP; Becky Thompson, Manager, Continuous Improvement, Redmayne Bentley; and Richard Preece, Director, DA Resilience.

FCA Speech: Regulating for growth – the future is now

The FCA has published a speech by Jessica Rusu, FCA Chief Data, Information and Intelligence Officer.

The speech notes that innovation and technology are central to the FCA’s strategy to support growth and competitiveness. 

Topics covered include:

  • The future of fintech
  • Crypto regulation
  • AI live testing

Read the full speech here.
