Published on 31.03.2025

Effectively governing the use of AI


An Introduction to AI Governance

Since generative AI’s introduction to the world stage in November 2022, AI and how to govern it have become central to conversations about the future of business, operational efficiency and cost cutting. This is no different in the financial services sector, where AI tools have been introduced in the areas of operations and compliance, including fraud detection and transaction monitoring, and even within strategic decision-making. According to the Bank of England and FCA survey on AI use, 75 percent of financial services firms are already using artificial intelligence tools in some capacity, with a further 10 percent planning to do so in the next three years. This has raised questions around accountability and the potential for adverse outcomes from AI use across the industry, questions that firms must grapple with. Firms must therefore consider the steps needed to assure the market and their customers that AI is being implemented in an effective but safe manner, with appropriate governance frameworks in place.

 

AI tool usage within financial services and its associated challenges

 

Operations and Compliance

One use case which has already seen considerable interest in the financial services sector is in operations and compliance, where firms have begun using AI pattern recognition to streamline transaction monitoring and fraud detection. These tools apply behavioural analytics and clustering to identify the subtle patterns that link fraudulent activities to one another.
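To make the mechanism concrete, the short sketch below shows one simplified way clustering can surface groups of related transaction behaviour. The features, thresholds and the choice of scikit-learn’s DBSCAN are illustrative assumptions, not a description of any particular vendor’s tool.

# Minimal illustrative sketch, not a production fraud model: cluster
# transactions on a few behavioural features so that unusual groups can be
# escalated for human review. Feature choices and thresholds are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

# Hypothetical per-transaction features: amount, hour of day, and number of
# transactions from the same account in the preceding 24 hours.
transactions = np.array([
    [120.0, 14, 2],
    [95.0, 15, 1],
    [110.0, 13, 3],
    [9800.0, 3, 25],   # large, late-night, high-frequency activity
    [9900.0, 3, 27],
    [105.0, 16, 2],
])

# Scale the features so no single dimension dominates the distance metric.
X = StandardScaler().fit_transform(transactions)

# DBSCAN groups transactions with similar behaviour. Points labelled -1 fit
# no dense cluster; small clusters that sit apart from the bulk of activity
# are exactly the "linked" patterns referred to above.
labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(X)

# Treat the largest cluster as routine behaviour and escalate everything else.
clusters, counts = np.unique(labels[labels >= 0], return_counts=True)
routine = clusters[np.argmax(counts)] if len(clusters) else None
for i, label in enumerate(labels):
    status = "routine" if label == routine else "escalate for review"
    print(f"transaction {i}: cluster {label}, {status}")

In practice a firm would use far richer features, but the principle is the same: the tool groups behaviour and a human analyst remains the final decision-maker on any escalation.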

Operations and Compliance Challenges

Although these tools are being used in what is, ultimately, a human control environment, increasing reliance is being placed on what are ‘black box’ AI systems. These systems produce outputs in a way that is not readily explainable, because the complex neural networks behind them pass data through hidden layers to arrive at a given output, and that opacity raises considerable questions around AI governance. In particular, where those charged with governance and/or Senior Managers under the Financial Conduct Authority’s Senior Managers and Certification Regime (SM&CR) are required to attest to the strength of their control environments, how can they be confident that black box controls are working as expected? Holders of Senior Management Functions (SMFs) and similar roles are not expected to be technical experts in AI; however, they are expected to gain (and provide) confidence that the control environment is as robust as the one they relied on in the pre-AI world.

These challenges are compounded by issues with data quality, and the potential for AI tools to produce biased or inaccurate outcomes when data is not adequately scrutinised. An AI tool is only as good as its data, so it is vital that firms have the controls and oversight mechanisms in place to mitigate the risk of adverse outcomes from bad data.
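As a simple illustration of what such a control might look like, the hypothetical check below quarantines records that fail basic quality rules before they ever reach the model; the field names and rules are assumptions made purely for the example.

# Illustrative data quality gate, with hypothetical field names and rules:
# records failing basic checks are quarantined and logged rather than being
# passed silently into the AI tool.
import pandas as pd

transactions = pd.DataFrame({
    "transaction_id": ["t1", "t2", "t3", "t4"],
    "amount": [120.0, -50.0, None, 9800.0],
    "currency": ["GBP", "GBP", "USD", "XXX"],
})

ALLOWED_CURRENCIES = {"GBP", "USD", "EUR"}

def quality_issues(row):
    """Return the list of data quality issues found in a single record."""
    issues = []
    if pd.isna(row["amount"]):
        issues.append("missing amount")
    elif row["amount"] <= 0:
        issues.append("non-positive amount")
    if row["currency"] not in ALLOWED_CURRENCIES:
        issues.append("unrecognised currency")
    return issues

# Split the data so the oversight function can see what was excluded and why.
transactions["issues"] = transactions.apply(quality_issues, axis=1)
clean = transactions[transactions["issues"].apply(len) == 0]
quarantined = transactions[transactions["issues"].apply(len) > 0]

print("passed to the model:", list(clean["transaction_id"]))
print("quarantined:", list(zip(quarantined["transaction_id"], quarantined["issues"])))

The point is not the specific rules but that exclusions are recorded, so those charged with governance can evidence how bad data is kept away from the tool.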

 

Decision-Making

A high-impact way that AI tools are being utilised in the financial services sector is in strategic decision-making.

Because adverse AI outcomes can be so harmful, using AI tools at a business-wide level raises significant governance challenges. While we are yet to see AI as a decision-maker in its own right, we have come a long way since Deep Knowledge Ventures appointed its AI, Vital, to its board in 2014 in a move that amounted to little more than a publicity stunt. AI is now being used to provide predictive analytics to guide investment decisions, leveraging unique data sources on a scale previously impossible.

Decision-Making Challenges

While AI has the potential to drastically improve business outcomes across the sector, it is vital that Senior Managers are able to justify the decisions made and maintain sufficient oversight to prevent adverse outcomes. Should firms fail to do so, there is a risk that bias, gaps in accountability and similar issues move from being bugs confined to one part of a financial services firm to being features of its long-term, business-wide strategic decisions.

 

How Can Novatus Help?

It is therefore vital for senior management at financial services firms to undertake a broad framework review to ensure that the introduction of AI does not come at the expense of adequate controls. At Novatus, we can help you with:

      • The development of an effective AI governance framework to maintain accountability and ensure that the risk of adverse AI outcomes is mitigated.
      • Undertaking periodic internal control reviews to ensure that you keep pace with the fast-moving developments of the artificial intelligence industry.

Contact us today or email john.gillam@novatus.global to learn more about Novatus’ AI Governance offering and how we can help you ensure that you adequately mitigate the risk of adverse AI outcomes.