There’s a saying attributed to W. Edwards Deming, the father of the mid-1980s quality movement: “In God we trust. All others must bring data.” It could hardly be more apt than when applied to the challenge of fighting financial crime and the importance of accessing the right data at the right time. While outwardly simple, safely taking the right bits of data from your data warehouse (or lake) and then carefully examining them for insight brings with it many latent and deeply ingrained challenges. Done incorrectly, it means the wrong people are potentially on the loose and free to do bad things. Done properly, it can maintain, or even build, brand value.
Much of the challenge relates to the growth of the imperfect organization: many different legal entities and applications, acquired and patched together over the course of successive mergers and acquisitions. Disposing of part of an organization can also force compromises, especially where some core piece of data processing has been tactically routed through the departing entity as part of a short-term fix. Financial services is a case in point, an industry that, ironically, sits at the center of the plumbing that moves money around the world. Much can be said for newer, simpler organizations, although they can suffer from a lack of experience, an immature culture and a rush for growth, all of which potentially lead to cut corners.
For established organizations, the cloud offers an efficiency opportunity, but it doesn’t necessarily solve the data integrity challenge. Poorly organized data ported from a legacy environment into the cloud is not the answer. Using data to fight financial crime starts with the integrity and consistency of records across multiple systems, whether in the cloud or on legacy platforms. The maintenance of client static data may be the least glamorous of the data challenges, but centralized control of change requests to a client’s static data goes a long way toward understanding volume and consistency across systems. Being able to measure the consistency of client identifiers across different processes is the starting point, as it provides a metric for client data integrity. In developing this, some of the key questions to ask are: How many systems hold client data? How are the different client identifiers mapped, and how consistent are they, across these systems? What is the governance process for making change happen? Ownership by a senior leader, such as a Chief Data Officer (CDO), is essential, as is locking down changes to client static data. The CDO sets the tone for data culture, and their management attention is essential. The data from our survey appears to echo this, with only 8% of global respondents citing lack of senior management buy-in as a concern. The follow-through and discipline on these efforts, however, matter most.
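To make the consistency metric concrete, the check can be sketched in a few lines. This is a minimal illustration, not a prescribed implementation: the system names, record keys and the exact-match rule are all illustrative assumptions, and in practice the matching logic would need to handle formatting differences and fuzzy matches.

```python
# Hypothetical sketch: measuring client-identifier consistency across systems.
# System names, record keys and the exact-match rule are illustrative assumptions.

from typing import Dict

# Each source system maps an internal record key to the client identifier it holds.
crm = {"rec-1": "CL-1001", "rec-2": "CL-1002", "rec-3": "CL-1003"}
payments = {"rec-1": "CL-1001", "rec-2": "CL-1002", "rec-3": "CL-9999"}  # one mismatch
kyc = {"rec-1": "CL-1001", "rec-2": "CL-1002"}  # one record missing entirely

def consistency_score(reference: Dict[str, str], other: Dict[str, str]) -> float:
    """Share of reference records whose client identifier matches in the other system."""
    matches = sum(1 for key, cid in reference.items() if other.get(key) == cid)
    return matches / len(reference)

for name, system in {"payments": payments, "kyc": kyc}.items():
    print(f"{name}: {consistency_score(crm, system):.0%} consistent with CRM")
```

Tracked over time and per system, a score like this gives the CDO the single number the text calls for: a measurable baseline for client data integrity.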
An IT strategy needs a data strategy. Data should be stored once and only once; typically, it is only when data is moved or replicated across an organization that breaks in data integrity begin to arise. The data strategy needs to be supported by a data governance framework, central to which is a data strategy steering committee. The committee should be staffed with experienced people who are equipped to ask the right questions, and authorized both to make decisions and to direct resources as required.
More broadly, the data challenge can also flow from the lack of an IT strategy. IT strategies tend to be verbose and difficult to understand. At its simplest, an effective organization-wide IT strategy should place each system into one of three categories: retire, maintain or invest. The organization should be asking which systems are strategic, which are to be retired and which are in a holding pattern. The decision to keep or retire a system is a function of cost, break/fix performance, data utility, functionality and alignment with strategic hardware and software standards. Unfortunately, decisions take time, and this complex area is often overlooked, to the organization’s cost.
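Because the keep-or-retire decision is described as a function of a handful of criteria, it can be sketched as a simple scorecard. The five criteria come from the text; the 1–5 scale, the equal weights and the category thresholds are purely illustrative assumptions that a real steering committee would set for itself.

```python
# Hypothetical sketch: a scorecard for the retire/maintain/invest decision.
# Criteria are from the text; scale, weights and thresholds are illustrative assumptions.

CRITERIA = ["cost", "break_fix", "data_utility", "functionality", "standards_alignment"]

def categorize(scores: dict) -> str:
    """Average the 1-5 criterion scores (higher = healthier) into a category."""
    avg = sum(scores[c] for c in CRITERIA) / len(CRITERIA)
    if avg >= 4.0:
        return "invest"
    if avg >= 2.5:
        return "maintain"
    return "retire"

# A legacy system scoring poorly on every criterion lands in the retire bucket.
legacy_ledger = {"cost": 1, "break_fix": 2, "data_utility": 2,
                 "functionality": 2, "standards_alignment": 1}
print(categorize(legacy_ledger))
```

The value of even a crude scorecard is that it forces the same questions to be asked of every system, turning an open-ended debate into a comparable, repeatable decision.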
The data challenge is not uniquely internal, either. Financial services are highly regulated, facing overlapping sets of requirements combined with regulators demanding high degrees of compliance. Much of this external data covers fraud monitoring, customer identification and verification databases, anti-money laundering transaction monitoring (AML TM), sanctions screening, digital identification systems, politically exposed person (PEP) screening and adverse media screening. Our survey results show the extent to which external data sources are used, with fraud monitoring and customer identification ranking highest at 91% and 87% respectively. Changes in these data sets must be tracked and governed; they will also require the involvement of supplier management teams and will be subject to some level of regulatory oversight.
The Importance of Data Hygiene
Good data practice is about fostering a positive culture of data stewardship. It is essential to recognize the importance of the data narrative and to understand how data flows across the organization’s technology and how it ties to regulatory oversight and operational risk. This is reflected in our survey results, with more than two-thirds of respondents planning to invest in technology and 60% expecting to increase their cybersecurity budgets to improve their data perimeter. Ironically, most organizations are driven by quarterly sales targets rather than data, even though the latter is essential to support the former. Data issues tend to surface only when there is a mistake, an audit or a change event. But bad data goes to the bottom line, whether in the form of operational errors and the cost of remediating them, regulatory capital, or fines. Good data practice comes from the top of the organization, with a zero-tolerance approach to bad data, much as the Six Sigma quality movement drove out manufacturing errors. The challenge is cultural, requiring strong data stewardship that is visible and transparent. Organizations are generally more comfortable rewarding teams for meeting sales targets or customer satisfaction goals than recognizing the team with the cleanest static data.
All of this will become even more pressing with the advent of AI, which is driven by data. Encouragingly, despite cautious, and in some cases even alarmist, commentary emerging in the media, perceptions of AI as part of the financial crime monitoring process are overwhelmingly positive in our survey; the challenge is well suited to it. Fifty-six percent of respondents reported that some form of AI had been implemented in their financial crime compliance programs, albeit recognizing that AI is still relatively new in the majority of these cases. Whether this starting point marks the beginning of a trend will take time to emerge, but with more than one in two reporting some level of AI implementation, it is a data point that cannot be ignored.
In summary, what was most encouraging from the survey was that at least two-thirds of respondents were using technology to screen for customer regulatory actions, law enforcement actions and sanctions. That still leaves a gap of one in three, a latent risk that supporting technology could close. Technology gives us access to data, but data discipline is essential to manage risk and run an organization. Data is rarely a glamorous topic, but much rests on it. The truth really is out there; you just need the data to find it.