FATF: Challenges in Implementing AML/CFT New Technologies
The previous blog post presented the opportunities of adopting new AML/CFT technologies and how they can help reduce false positives and improve data quality, customer due diligence processes, and transaction/behavior monitoring. It is also interesting to look at what institutions identified as challenges in implementing such technologies, based on FATF’s report “Opportunities and Challenges of New Technologies for AML/CFT”. Here is a summary of the issues preventing institutions from adopting new technology more widely, as per FATF’s report.
The effective use of new technologies for AML/CFT has two crucial components: standardized data and feedback. When systems are based on good-quality data, it is easier for developers to integrate that data into their tools, and easier to interpret and explain it to non-experts as well as to communicate it to other stakeholders and competent authorities. With regard to feedback, if financial intelligence units and competent authorities provided banks and other reporting entities with real money laundering cases and examples of suspicious activity, this would help train AI and machine learning systems, leading to better hit rates and more efficient detection and prevention.
Without further ado, let’s look at the challenges identified in the responses of reporting entities.
Challenge > Data Interpretability & Explainability
New technology is based on large volumes of data and tends to surface far more information than entities are used to handling. Hence, the interpretability and explainability of all this new data can be a challenge.
If entities cannot clearly explain and transparently implement the new solutions, this can undermine their ability to assess the AI’s accuracy in identifying suspicious activity. On the other hand, the report says that “the issues with interpretability limit the ability to build trustworthy relationships between technology providers and users”. This creates doubt that data processed through new technologies is robust.
In response, reporting entities indicate that they need additional guidance on how to interpret current regulations in the digital era, and more engagement from supervisors. Respondents to the FATF questionnaire indicated that “some supervisors are not as engaged as the private sector with the technology sector”. This leaves supervisors unaware of new trends and emerging data solutions. Further, supervisors’ lack of specialist skills, resources, and knowledge adds to the interpretability challenge.
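To make the explainability point concrete, here is a deliberately simple, hypothetical sketch (not from the FATF report): a transparent, rule-based risk score that returns human-readable “reason codes” alongside the score. This is the kind of output that is easy to interpret and explain to reviewers and supervisors, in contrast to an opaque model that only emits a number. All thresholds, weights, and country codes below are invented for illustration.

```python
# Hypothetical sketch of an explainable risk score: every point added to
# the score is paired with a plain-language reason a reviewer can read.
# All thresholds, weights, and country codes are invented examples.

HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder jurisdiction codes

def score_transaction(txn: dict) -> tuple[int, list[str]]:
    """Return a risk score and the human-readable reasons behind it."""
    score, reasons = 0, []
    if txn["amount"] > 10_000:
        score += 40
        reasons.append("amount above the 10 000 reporting threshold")
    if txn["country"] in HIGH_RISK_COUNTRIES:
        score += 30
        reasons.append(f"counterparty in higher-risk jurisdiction {txn['country']}")
    if txn["new_beneficiary"]:
        score += 15
        reasons.append("first payment to this beneficiary")
    return score, reasons

txn = {"amount": 12_500, "country": "XX", "new_beneficiary": True}
score, reasons = score_transaction(txn)
print(score)        # 85
for r in reasons:
    print("-", r)
```

The design choice here is the pairing of score and reasons: because each contribution to the score carries its own explanation, the outcome can be audited and communicated to non-experts and competent authorities, which is precisely what respondents say opaque solutions make difficult.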
Challenge > Data Harmonization (or lack thereof) & Ensuring Data Quality
What if your new tech solution lacks data harmonization? The system would require more frequent fine-tuning, which certainly comes at a cost, and it may require adjustment to different jurisdictional requirements and formats. That’s why respondents mention that the lack of harmonization makes it impossible to use the technology at scale. “This could potentially prevent innovation from reaching cost-effectiveness and hamper its development. […] Without scalability, some technological tools might not be financially feasible.”
“45% of the respondents to FATF’s questionnaire say data quality is seen as an obstacle to the adoption of AML/CFT technology-based solutions.”
According to FATF’s report, machine learning tools rely on existing systems and their manual updating, which can generate instances where “bad data” is inputted and negatively impacts the models adopted. This includes the data on which a machine learning system is trained. If the training data includes false positives or other errors, these errors will be “trained in” to the machine learning system, although some margin for error will still be needed for instances of human bias or unidentified errors (FATF, 2021, p. 41).
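The “trained-in” error mechanism can be illustrated with a toy model (a hypothetical sketch, not from the report): a one-threshold classifier is fitted first on clean labels and then on labels polluted with a legacy system’s false positives. The polluted labels shift the learned threshold, so the new model faithfully reproduces the old system’s mistakes. The data, the 10 000 rule, and the “legacy false positive” band are all invented for illustration.

```python
import random

random.seed(42)

# Toy data: transaction amounts; the "true" rule flags amounts above 10 000.
amounts = [random.uniform(0, 20_000) for _ in range(1_000)]
true_labels = [a > 10_000 for a in amounts]

def fit_threshold(data, labels):
    """Learn the single cut-off that best reproduces the given labels."""
    return max(sorted(set(data)),
               key=lambda t: sum((a > t) == l for a, l in zip(data, labels)))

def accuracy(t, data, labels):
    return sum((a > t) == l for a, l in zip(data, labels)) / len(labels)

# Trained on clean labels, the model recovers the true rule exactly.
clean_t = fit_threshold(amounts, true_labels)

# Now pollute the training labels with legacy false positives: the old
# system also (wrongly) flagged everything between 8 000 and 10 000.
noisy_labels = [l or 8_000 < a <= 10_000 for a, l in zip(amounts, true_labels)]
noisy_t = fit_threshold(amounts, noisy_labels)

# The errors are "trained in": the new model learned the legacy bias.
print(accuracy(clean_t, amounts, true_labels))   # 1.0
print(accuracy(noisy_t, amounts, true_labels))   # roughly 0.9
```

The model trained on polluted labels learns a threshold near 8 000 instead of 10 000, so roughly a tenth of legitimate transactions inherit the legacy system’s false-positive flag; this is the sense in which upstream data-quality problems become model problems.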
Challenge > Human Input and Capacity Building
AI and machine learning promise, and have the potential to deliver, many improvements, but most of them still require human review and input. As such, they are not replacements for existing systems but are viewed as enhancements. Human input is mostly needed where technology still falls short, for example, on regional inequalities or expertise on emerging issues.
Operational issues around human capital concern actors’ ability to understand these tools and to train staff to implement them.
Additionally, FATF’s report points out that researchers are discovering that many AI algorithms replicate program developers’ conscious and unconscious biases and apply them at scale, unfairly flagging the financial activities of certain types of individuals or entities as suspicious, or producing risk profiles and decisions that deny them access to certain financial products and services.
Challenge > Sharing Information with Counterparts / Across Borders
“To fully understand the nature and risk of suspicious transactions, actors require access to their full pathway which is often beyond borders or held by other entities.”
If reporting entities are unable to share information with other parties or across borders, the effectiveness of new technology is limited. Although this challenge stems from regulatory issues and exists with or without new technology in AML/CFT, the new systems certainly provide an opportunity for improvement.
Challenge > Security and Protection from Criminal Interference
“Although it did not come up high in the response from the private sector, it may be more significant from public policy and law enforcement perspective.”
This challenge comes up in response to the growing number of criminal cases associated with the use of technologies related to identity fraud or criminal operations that use money mules.
Among the most frequently cited risks of digitalization in FATF’s report are the abuse of the system by criminals and its contribution to increasing the vulnerability and financial exclusion of certain segments of society, e.g. the elderly and rural or remote communities.
Challenge > Risk of Unintended Consequences
Typically, new technology or implementation of new processes comes with risks of unintended consequences. As an example, the FATF report states that digital identity solutions that do not provide adequate risk-based technical assurance and appropriate governance present operational risks and potential unintended consequences and are also open to deliberate abuse. “When adopted without regard to the risk-based approach or proportionality, digital ID solutions may add to the exclusion of underserved communities.”
Challenge > Reluctance to Invest
To conclude the challenges, the most common ones highlighted in FATF’s report, which we also hear most often regarding the adoption of any new technology, include:
> High costs
> Difficulty integrating new technologies with legacy systems
> Solutions beyond the entity’s technical capacity to use appropriately and effectively
> Technology becoming outdated and requiring additional investment in newer solutions
> Solutions that do not meet regulatory expectations or fail to satisfy a particular examiner (who may lack the capacity to evaluate the solution’s effectiveness or be uncomfortable with innovative solutions)
> Risks such as privacy violations and AML/CFT compliance failures
> Potential conflict with competing objectives such as privacy and inclusion, and vulnerability to witting abuse
> Ethical and legal concerns around the use of AI
Then what can the anti-financial crime community do in order to support and promote wider adoption of new technologies?
Supervisors and international bodies such as the FATF can influence progress in the modernization of AML/CFT systems and technology. The report recognizes that support from FATF and national competent authorities for innovation in AML/CFT is needed.
Additionally, respondents want guidance on the interpretation of current regulations in the digital era. This would also include supervisors updating their own systems and supervisory strategies in order to better interpret and supervise AML/CFT in the digital age. The private sector could benefit from additional clarification on issues related to accountability, transparency, and the supervision of entities using new technologies.
The report also points out that:
“Wider implementation of technology will be possible if there are more significant incentives, either mandated use or a greater trust environment that supports investment and justify reform.”
Regulated entities would need to gain skills in explaining the technical details and remain accountable to supervisors. It is important for regulated entities to continually examine the effectiveness of these new technologies in detecting and combating ML/TF risks.
This will lead to:
> Regulated entities becoming more outcome-oriented
> Adopted technology that is fit for purpose and performs adequately
> A feedback loop for both public and private sectors to re-calibrate technology-based solutions
> Support for supervisors in their assessment of new technologies
Additionally, all actors should assess whether there are residual risks that may arise with the use of new technologies, or where there are key human elements that cannot be fully replaced by new technologies. Where residual risks are identified, regulated entities should demonstrate awareness of these risks and the ability to manage or respond to these when needed.
Finally, the role and consumer appetite for new technologies in financial services will become increasingly relevant as CDD and other individual-focused digital solutions become more prominent.
What do you think is needed to overcome real and perceived challenges in the implementation of new technology for AML/CFT?
FATF (2021), Opportunities and Challenges of New Technologies for AML/CFT, FATF, Paris, France, https://www.fatf-gafi.org/publications/fatfrecommendations/documents/opportunities-challenges-new-technologies-aml-cft.html