Views from AFME
Firms Need a License to Innovate: Artificial Intelligence and Machine Learning in Financial Crime Compliance
29 Mar 2023
Author: Perez Adelaja, Graduate, Technology and Operations

Artificial Intelligence and Machine Learning (AI/ML) have been in the news again with the release of OpenAI’s chatbot, ChatGPT. The tool uses elements of artificial intelligence to communicate with people in a human-like way and has fascinated the public with its swift, detailed responses and potentially wide-ranging uses.

However, ChatGPT is merely the latest public development in the long-established field of AI/ML, which the financial sector has been exploring for many years. AI/ML has the potential to transform the financial industry through rapid analytical tools and enhanced data processing capacity. For example, huge volumes of data can be analysed more efficiently to improve trading strategies or to optimise capital models.

Financial regulatory authorities around the globe have equally been paying close attention to AI/ML, as they acknowledge it has many conceivable uses within the sector. The EU is currently developing an AI Act, while the UK has released a discussion paper on AI/ML, to which AFME responded. A key message was the need for more dialogue with supervisors on areas where firms may find it more challenging to innovate, such as financial crime compliance. This is an area with huge potential for the use of AI/ML, although barriers to its deployment still exist.

The Potential for AI/ML in Financial Crime Compliance

AI/ML could prove especially transformative in the prevention and detection of financial crime, including anti-money laundering and combatting the financing of terrorism (AML-CFT). By taking advantage of the advances in data analytics derived from AI/ML, financial institutions (FIs) can make more effective use of the client, communications and transaction data created by their products and services.

Financial institutions’ surveillance systems could benefit from more sophisticated incorporation of unstructured data into datasets (for example, to contextualise transaction data with current affairs) or from better oversight of internal and client communications using natural language processing to detect misconduct. Incorporating AI/ML would also improve the accuracy and relevance of the alerts generated by these surveillance systems, for both financial institutions and their supervisors. For instance, Europol estimates that currently just 10% of suspicious transaction reports submitted lead to further investigation by competent authorities.
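To make the idea concrete, the sketch below shows one way a supervised model could be used to rank monitoring alerts so that investigators review the most likely true positives first. It is a minimal, illustrative example in Python: the feature names, synthetic data and use of scikit-learn are assumptions made for the sketch, not a description of any firm’s actual surveillance system.

# Illustrative sketch only: ranking transaction-monitoring alerts with a
# supervised model so that investigators see the most suspicious cases first.
# All data below is synthetic and the feature names are hypothetical; a real
# deployment would use a firm's own alert history and risk indicators.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import average_precision_score

rng = np.random.default_rng(0)

# Hypothetical features for past alerts: amount, cross-border flag,
# counterparty risk score, and a count of adverse-media hits (unstructured
# data summarised into a numeric signal).
n = 5_000
X = np.column_stack([
    rng.lognormal(mean=8, sigma=1, size=n),      # transaction amount
    rng.integers(0, 2, size=n),                  # cross-border flag
    rng.random(size=n),                          # counterparty risk score
    rng.poisson(0.3, size=n),                    # adverse-media hit count
])
# Synthetic labels: 1 = alert led to a filed report, 0 = false positive.
logits = 0.00005 * X[:, 0] + 1.2 * X[:, 1] + 2.0 * X[:, 2] + 0.8 * X[:, 3] - 4.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Score unseen alerts; investigators would work down the list from the top.
scores = model.predict_proba(X_test)[:, 1]
print("Average precision on held-out alerts:",
      round(average_precision_score(y_test, scores), 3))

In practice the value of such a ranking lies in triage: the same investigative capacity is focused on the alerts most likely to warrant a report, which is precisely the accuracy and relevance gain described above.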

Barriers to Deployment

Integrating AI/ML solutions into financial crime compliance, however, is not simple. To ensure a suitable baseline standard, the relevant regulatory frameworks tend to be rule-based rather than principles-based. For example, they mandate analysis against specific risk indicators or variables, leaving minimal scope for a highly tailored approach or innovative technological solutions.

A key benefit of AI/ML as a technology, on the other hand, is its ability to find new solutions for the task it is set, incorporating different data sources and identifying new patterns. Given this mismatch between prescriptive regulatory requirements and the adaptive nature of AI, FIs interested in implementing AI/ML applications in financial crime compliance cannot use them to retire their current systems, even where the new approach produces more effective results.

In addition, where FIs determine that they can deploy AI/ML in financial crime compliance, they must ensure that they can provide sufficient transparency on the applications’ outputs. This is to satisfy both themselves and their supervisors that an application is performing to a suitably high standard of accuracy and efficiency, while not contravening FIs’ ongoing obligations in areas such as data and client protection. Such transparency will need to be assessed throughout the lifecycle of an application and will require upskilling for both FIs’ compliance teams and their supervisors, to ensure that the right level of challenge is present.
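As a rough illustration of the transparency point, the sketch below uses permutation importance to show which inputs most influence a hypothetical alert-scoring model, giving compliance teams something to compare against the risk drivers they would expect. The data, feature names and choice of scikit-learn are again assumptions; a real model review would go much further, covering data quality, drift, bias and documentation across the application’s lifecycle.

# Illustrative sketch only: providing basic transparency on a model's outputs
# by measuring which inputs drive its alert scores. Feature names and data are
# hypothetical; real reviews would also cover data lineage, drift and bias.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
features = ["amount", "cross_border", "counterparty_risk", "adverse_media_hits"]

# Synthetic alert history standing in for a firm's labelled outcomes.
X = rng.random((2_000, len(features)))
y = (X[:, 2] + 0.5 * X[:, 1] + rng.normal(0, 0.2, 2_000) > 1.0).astype(int)

model = RandomForestClassifier(random_state=1).fit(X, y)

# Permutation importance: how much does performance degrade when each feature
# is shuffled? Larger drops indicate inputs the model relies on, which can be
# compared against the risk indicators compliance teams expect to matter.
result = permutation_importance(model, X, y, n_repeats=10, random_state=1)
for name, mean_drop in sorted(zip(features, result.importances_mean),
                              key=lambda item: -item[1]):
    print(f"{name:22s} {mean_drop:.3f}")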

Steps in the Right Direction

Despite these challenges, the industry is keen to explore how AI/ML can be incorporated into the fight against financial crime while continuing to meet regulatory expectations.

In December 2022, the Wolfsberg Group of FIs released its “Principles for Using Artificial Intelligence and Machine Learning in Financial Crime Compliance” to support the wider industry in its technological innovation in this field. These principles serve as a helpful guideline for how FIs should apply AI/ML in financial crime compliance.

Globally, regulators have also begun to recognise the significance of AI/ML in financial crime compliance and the importance of supporting its development. Notably, the Monetary Authority of Singapore (MAS) announced in October 2021 that it would introduce a digital platform and enabling regulatory framework for financial institutions to share relevant information on customers and transactions with one another, to prevent money laundering, terrorism financing and proliferation financing. US regulators have also spoken about the potential to improve AML-CFT through the use of technologies such as AI/ML.

In the UK, the FCA’s 2019 report on Machine Learning in Financial Services explored financial crime as one of its case studies. In 2020, the French ACPR included AML-CFT in its series of technical workshops on how AI can be developed within the sector. The ECB has also been considering the challenge of digitisation, including AI/ML, in the context of the new European AML framework.

Conclusion

It is encouraging to see multiple regulators recognising the potential of AI/ML in preventing and detecting financial crime. However, further collaboration with authorities will be required to ensure that innovation in this field can be supported while firms continue to meet regulatory expectations – and, as recently noted by the Financial Action Task Force, even statements of regulatory support for the adoption of new technologies do not always translate into real supervisory acceptance of new compliance practices and procedures. To quote the Director of the US Financial Crimes Enforcement Network: “innovation will only happen if the private sector feels it has latitude to innovate.” AFME would welcome further discussions with authorities in this area and looks forward to supporting the development of AI/ML in the fight against financial crime.
