Unfortunately, by prioritising ad-hoc incident resolution, organisations struggle to identify and address recurring data quality problems in a structured way. So what is the right approach? Boyke Baboelal, Strategic Solutions Director of Americas at Asset Control, answers the question for Finance Monthly.

To rectify this, organisations must carry out continuous analysis aimed at understanding their data quality and reporting on it over time. Few are doing this, and that's a problem. After all, if firms fail to track what was done historically, they will not know how often specific data items contained completeness or accuracy issues, how often mistakes were made, or how frequently quick bulk validations replaced more thorough analysis.
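
To make this concrete, the sketch below shows one way recorded validation results could be aggregated to reveal how often each data item has had issues over time. It is an illustration only; the field names and sample values are assumptions, not any particular vendor's schema.

```python
from collections import defaultdict
from datetime import date

# Hypothetical log of validation results: one record per data item per business day.
# 'issue' is None when the item passed all checks that day.
validation_history = [
    {"item": "EUR/USD close", "date": date(2018, 6, 1), "issue": None},
    {"item": "EUR/USD close", "date": date(2018, 6, 4), "issue": "stale value"},
    {"item": "IBM dividend",  "date": date(2018, 6, 1), "issue": "missing field"},
    {"item": "IBM dividend",  "date": date(2018, 6, 4), "issue": None},
]

def issue_rates(history):
    """Return, per data item, the share of observed days on which an issue was recorded."""
    totals = defaultdict(int)
    issues = defaultdict(int)
    for record in history:
        totals[record["item"]] += 1
        if record["issue"] is not None:
            issues[record["item"]] += 1
    return {item: issues[item] / totals[item] for item in totals}

print(issue_rates(validation_history))
# {'EUR/USD close': 0.5, 'IBM dividend': 0.5}
```

Even a simple summary like this makes recurring problems visible, which is exactly what ad-hoc incident resolution fails to do.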

To address this, organisations need to put in place a data quality framework. Indeed, the latest regulations and guidelines increasingly require them to establish and implement this.

That means identifying the critical data elements, the risks and likely errors or gaps in that data, and the data flows and controls already in place. Using such a framework, organisations can outline a policy that establishes a clear definition of data quality and its objectives, and that documents the data governance approach, including processes and procedures, responsibilities and data ownership.
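
As an illustration of how such an inventory might be recorded (the element names, risks and controls below are assumptions, not a prescribed standard), critical data elements can be captured in a simple, reviewable structure alongside their risks, controls and owners:

```python
# Illustrative inventory for a data quality framework: each critical data element
# is listed with its known risks, the controls applied to it, and a named owner.
DATA_QUALITY_FRAMEWORK = {
    "equity_close_price": {
        "risks": ["stale price", "missing value", "wrong currency"],
        "controls": ["four-eyes check on large moves", "cross-vendor comparison"],
        "owner": "Market Data Operations",
    },
    "counterparty_rating": {
        "risks": ["outdated rating", "unmapped counterparty"],
        "controls": ["daily completeness check", "monthly reconciliation"],
        "owner": "Credit Risk Data Team",
    },
}
```

Keeping this inventory explicit, whether in code, a register or a governance tool, is what allows the policy to be audited against what is actually in place.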

The framework will also help organisations establish the dimensions of data quality: that data should be accurate, complete, timely and appropriate, for instance. For each of these dimensions, key performance indicators (KPIs) need to be implemented so that the organisation can measure data quality in concrete terms, while key risk indicators (KRIs) need to be implemented and monitored so that the organisation knows where its risks lie and has effective controls in place to deal with them.
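
The sketch below shows one way KPIs for two of these dimensions could be computed and compared against KRI thresholds. The dimension definitions, thresholds and sample records are illustrative assumptions rather than regulatory requirements.

```python
# Illustrative KPI calculations for two data quality dimensions, plus KRI
# thresholds that flag a breach when a KPI falls below its floor. Figures are made up.
def completeness(records):
    """Share of records with a populated price field."""
    populated = sum(1 for r in records if r.get("price") is not None)
    return populated / len(records)

def timeliness(records, deadline_hour=18):
    """Share of records received before an assumed end-of-day deadline."""
    on_time = sum(1 for r in records if r["received_hour"] <= deadline_hour)
    return on_time / len(records)

KRI_THRESHOLDS = {"completeness": 0.99, "timeliness": 0.95}

def breached_kris(kpis, thresholds=KRI_THRESHOLDS):
    """Return the dimensions whose KPI has fallen below its risk threshold."""
    return [dim for dim, floor in thresholds.items() if kpis[dim] < floor]

records = [
    {"price": 101.2, "received_hour": 17},
    {"price": None,  "received_hour": 17},
    {"price": 99.8,  "received_hour": 19},
]
kpis = {"completeness": completeness(records), "timeliness": timeliness(records)}
print(kpis, breached_kris(kpis))
```

The point is not the specific thresholds but that each dimension has a measurable definition and an agreed level below which the organisation treats the situation as a risk.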

A data quality framework will inevitably be focused on the operational aspects of an organisation's data quality efforts. To take data quality up a further level, though, businesses can employ a data quality intelligence approach, which delivers a much broader level of insight, analysis, reporting and alerting.

This in turn allows the organisation to capture and store historical information about data quality, including how often an item was modified and how often data was erroneously flagged. More broadly, it provides analysis capabilities for these exceptions and any other data issues that arise, analysis capabilities for testing the effectiveness of key data controls, and reporting capabilities covering data quality KPIs, vendor and internal data source performance, control effectiveness and SLAs.
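
As a simple illustration of that kind of analysis (again using assumed field names and sample data), a stored exception history could be queried for how often flags turned out to be false positives per data source, one common measure of control effectiveness and vendor performance:

```python
from collections import defaultdict

# Hypothetical exception log: each flagged value records its source and whether
# review confirmed a genuine error or found the flag to be a false positive.
exceptions = [
    {"source": "Vendor A", "false_positive": True},
    {"source": "Vendor A", "false_positive": False},
    {"source": "Vendor B", "false_positive": True},
    {"source": "Vendor B", "false_positive": True},
]

def false_positive_rate_by_source(log):
    """Share of flagged exceptions per source that reviewers dismissed."""
    flagged = defaultdict(int)
    dismissed = defaultdict(int)
    for e in log:
        flagged[e["source"]] += 1
        if e["false_positive"]:
            dismissed[e["source"]] += 1
    return {s: dismissed[s] / flagged[s] for s in flagged}

print(false_positive_rate_by_source(exceptions))
# {'Vendor A': 0.5, 'Vendor B': 1.0}
```

A persistently high false-positive rate suggests a control is tuned too aggressively, while a rate near zero for a source with known problems suggests the control is missing real errors; either way, the history is what makes the judgement possible.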

In short, data quality intelligence forms a further layer on top of the operational data quality functionality provided by the framework, helping to visualise what has been achieved, confirm that data controls are effective, and show that the organisation is meeting its KPIs and staying within its KRI thresholds. Rather than an operational tool, it is effectively a business intelligence solution, providing key insight into how the organisation is performing against its data quality goals and targets. CEOs and chief risk officers (CROs) would benefit from this functionality, as would compliance and operational risk departments.

While the data quality framework helps deliver the operational aspects of an organisation's data quality efforts, data quality intelligence gives key decision-makers and other stakeholders insight into that approach, helping them measure its success and demonstrate that the organisation complies with its own data quality policies and relevant industry regulations.

The financial services industry is starting to focus more on data quality. In Experian's 2018 global data management benchmark report, 74% of financial institutions surveyed said they believed that data quality issues impact customer trust and perception, and 86% saw data as an integral part of forming a business strategy.

Data quality matters. As Paul Malyon, Experian Data Quality’s Data Strategy Manager, puts it: “Simply put, if you capture poor quality data you will see poor quality results. Customer service, marketing and ultimately the bottom line will suffer.”

In financial services, with its significant regulatory burden, the consequences of poor data quality are even more severe. It is therefore a timely moment to roll out the multi-layered approach outlined above. It helps firms demonstrate the accuracy, completeness and timeliness of their data, which in turn helps them meet relevant regulatory requirements and assess compliance with their own data quality objectives. There has never been a better time for financial services organisations to take the plunge and get their data quality processes up to scratch.