Finance Firms Need to Focus on Data Fundamentals in 2021

Amid increasing data regulation and financial strain, ambitious firms must refocus their attention on the twin pillars of data quality and data governance.

Finance Monthly hears from Wayne Parslow, Executive Vice President for EMEA at Validity, as he explores what the financial services sector stands to gain from better handling of its data.

Financial firms face an increasingly complex minefield of regulations when it comes to handling data. The sector has so many acronyms that it’s often difficult for a layperson to wrap their head around them. Unfortunately, finance companies don’t fare that much better, and can be overwhelmed by seemingly infinite customer data management requirements.

Whether it’s ensuring appropriate customer data storage under GDPR or securing payments processes under PSD2 and PCI-DSS, there’s a host of regulatory pressures for managing the financial customer relationship chain.

Regulatory bodies are certainly not toothless when it comes to enforcing punitive measures, either. At the end of 2020, the ICO issued fines to both OSL Financial Consultancy Limited and Pownall Marketing Limited for misusing personal data.

Data Management Difficulties

Ensuring data held by finance firms is accurate, up to date and, equally importantly, used appropriately is a shared goal for both the regulator and financial institutions. However, with the pressures put on financial firms by the pandemic, there’s a good chance that data management best practice has taken a back seat in favour of ensuring business continuity.

This is a misstep, as the two key fundamentals of data – data quality and data governance – should be tied into the basic operations of a financial services firm. With strong data foundations, financial services firms will be in a far stronger position to navigate the upcoming uncertainty of a post-pandemic world.


Having data quality and governance work in concert to support one another does not simply ensure regulatory compliance, though. The value of data for driving successful business outcomes has already been proven, and businesses which employ a data-driven strategy are growing 30% year-on-year. Higher data quality also delivers stronger customer relationships and greater engagement.

Curating Quality

Data quality is not a once-and-done operation. For financial services in particular, it's a complex network of processes and actions that must be continuously maintained as new data is collected, augmented and edited by the organisation.

First and foremost, a finance firm must take stock of the current state of its data. Given the rapid changes that have occurred over the past year, it's essential to reassess data for accuracy, completeness, duplicates and inconsistencies. Data first needs to be housed correctly so that it can be profiled accurately. Profiling their data enables financial organisations to ensure it is right for the business's current needs, that it can be easily analysed and reported on, and that it is straightforward to check whether it is up to date.
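As a purely illustrative sketch of what such a profiling pass involves (the field names and records here are invented examples, not anything prescribed by a regulator or specific tool), a basic check for completeness and duplication might look like this:

```python
# Hypothetical customer records; the field names are illustrative assumptions.
records = [
    {"name": "A. Smith", "email": "a.smith@example.com", "region": "UK"},
    {"name": "A. Smith", "email": "a.smith@example.com", "region": "UK"},
    {"name": "B. Jones", "email": "", "region": "EMEA"},
]

def profile(records):
    """Summarise completeness and duplication for a batch of records."""
    total = len(records)
    # Completeness: share of records with every field populated.
    complete = sum(1 for r in records if all(r.values()))
    # Duplicates: records whose full contents repeat an earlier record.
    seen, dupes = set(), 0
    for r in records:
        key = tuple(sorted(r.items()))
        dupes += key in seen
        seen.add(key)
    return {"total": total, "complete": complete, "duplicates": dupes}

print(profile(records))  # {'total': 3, 'complete': 2, 'duplicates': 1}
```

In practice a firm would run this kind of summary across whole databases and far more fields, but the principle is the same: measure the data before trying to fix it.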

Deduplication

A common barrier to data quality is duplicates. Many regulations require data to be kept up to date, and customer data to be removed under certain circumstances (e.g. when a contract is terminated). Whilst a firm might believe it has done its due diligence in these circumstances, leaving duplicate data behind poses a significant compliance threat and risks inappropriate or even illegal communication. To maintain a consistent, complete view of its customer data, a financial firm must manage deduplication proactively. It's a simple yet effective process that can make a huge impact, but it requires investment in the appropriate tools.
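To make the idea concrete, here is a minimal sketch of deduplication keyed on a normalised email address (the matching rule and sample data are assumptions for illustration; production tools match on far richer criteria):

```python
def dedupe(records, key="email"):
    """Keep the first record per normalised key; drop later duplicates."""
    seen, unique = set(), []
    for r in records:
        k = r.get(key, "").strip().lower()
        if k and k in seen:
            continue  # duplicate of an earlier record, so drop it
        seen.add(k)
        unique.append(r)
    return unique

customers = [
    {"email": "Jane.Doe@example.com"},
    {"email": "jane.doe@example.com "},  # same person, different formatting
    {"email": "sam.lee@example.com"},
]
print(len(dedupe(customers)))  # 2
```

The crucial point the example shows is that duplicates rarely match character-for-character; without normalisation, the two "Jane Doe" records above would slip through, and a deletion request could be honoured for one copy but not the other.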


Security and Enhancing Data

The end user is typically identified as the weakest link in the security chain, and many breaches reported to the ICO stem from simple user error, whereby an employee downloads a confidential document to a laptop which is then lost or stolen, for example.

With the move to remote working last year, many businesses wisely took the step to upskill their now remote workforces with additional security best practice training to help mitigate the additional cybersecurity risks.

Organisations can take additional steps to prevent errors that create vulnerabilities, such as the lost laptop example above. Employees will often adopt whatever methods help them get their jobs done most efficiently, even if these deviate from security best practice. Standardising data is a crucial step towards enabling it to move through the organisation in the correct, and secure, way – regardless of location.

For example, if finance needs to produce reports based on the outgoings of several international teams, standards as basic as how job titles and regions are entered mean the work can be completed more efficiently, easily and securely across the board.
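A tiny sketch of what such standardisation means in practice (the alias table and canonical spellings below are invented for illustration, not an industry standard):

```python
# Illustrative alias map; the canonical values are assumptions, not a standard.
REGION_ALIASES = {
    "uk": "United Kingdom",
    "united kingdom": "United Kingdom",
    "usa": "United States",
    "u.s.": "United States",
    "emea": "EMEA",
}

def standardise_region(raw):
    """Map free-text region entries onto one canonical spelling."""
    cleaned = raw.strip().lower()
    return REGION_ALIASES.get(cleaned, raw.strip())

print(standardise_region("  uk "))  # United Kingdom
print(standardise_region("USA"))    # United States
```

With every team entering "United Kingdom" the same way, reports can be grouped and filtered reliably rather than by guesswork over spellings.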

Alongside profiling, deduplication and process standardisation, verification should be a top priority and should take place as data is collected. Both prospect and existing client data should be verified against external sources (provided, of course, that consent has been given for those sources to be used in this way). Verifying and enriching data in this way ensures finance firms get a better ROI from marketing and sales.
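The cheapest form of verification happens at the point of entry, before any external source is consulted. As a minimal sketch (this is only a syntactic plausibility check; genuine verification would query external validation services, with consent, as described above):

```python
import re

# Minimal syntactic check only: confirms the shape of an address, not that
# the mailbox exists. Real verification relies on external sources.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_valid(email):
    """Reject obviously malformed email addresses at the point of capture."""
    return bool(EMAIL_RE.match(email))

print(looks_valid("client@bank.co.uk"))  # True
print(looks_valid("not-an-email"))       # False
```

Catching malformed entries at capture time keeps bad records out of the pipeline entirely, which is far cheaper than cleansing them later.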

Adopting a Data Mindset

Data is constantly changing, and a continuous monitoring regime is the only way to keep track as it waxes and wanes. A simple way to keep up with the health of your data as it changes is to set up dashboards and alerts that track data quality automatically.
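As an illustrative sketch of the alerting idea (the required fields, threshold and message format are all assumptions chosen for the example, not recommendations):

```python
def quality_score(records, required=("name", "email")):
    """Share of records with all required fields populated."""
    if not records:
        return 1.0
    ok = sum(1 for r in records if all(r.get(f) for f in required))
    return ok / len(records)

def check_alert(records, threshold=0.95):
    """Return an alert message when quality drops below the threshold."""
    score = quality_score(records)
    if score < threshold:
        return f"ALERT: data quality {score:.0%} below {threshold:.0%}"
    return None  # quality acceptable, no alert

batch = [{"name": "A", "email": "a@x.com"}, {"name": "B", "email": ""}]
print(check_alert(batch))  # ALERT: data quality 50% below 95%
```

A dashboard is essentially this calculation run on a schedule and plotted over time, so that a drop in quality is noticed in days rather than at the next audit.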

That said, it’s not just about technology. There’s no getting away from it – a comprehensive, cross-functional approach is needed to implement a successful data governance programme. For finance firms, team members must be subject matter experts who understand the industry’s complex standards and regulations, and know where to turn when they don’t. Many finance organisations will already have an executive-level representative responsible for company-wide data management, such as a Chief Data Officer (CDO).

A core aspect of a CDO’s responsibilities should be simplifying processes with the help of the right technologies. However, it’s unlikely there’s a single tool that will do everything a financial organisation needs, and every governance strategy should be bespoke for the organisation that will follow it. Companies should be aiming for a “data quality by design” mindset, where the checks and processes that ensure top-quality data is maintained become second nature.
