The Need for Data Quality in Banks

by Suganthi Shivkumar, Nov 10, 2009

Poor data quality is widespread across organizations. Unfortunately, most organizations
accept it as a day-to-day operational challenge and devise both
simple and complex workarounds to compensate for the data’s
shortcomings.

Organizations that manage financial risk exposures also suffer from poor data quality, yet they are able to function
with apparent efficiency. For example, a data quality assessment
revealed that a particular bank had over 2 billion in corporate loan
exposures without maturity dates. (Source: Informatica)

This finding highlighted both poor data quality and poor business process.
But while the results are appalling, such data quality did not seem to
have affected the bank’s business, at least up until the recent credit crunch. So
in the past, the bank would not have prioritized this data deficiency;
it would have deemed other issues more worthy of attention and budget.
Now senior-level executives recognize data quality as critical in
supporting a range of banking reports.

Basel II is one of the major drivers of change within the banking world. Because it is used to
assess risk, the quality of the underlying data is critical to
delivering any report with confidence. Financial
institutions are adopting Basel II not simply because it is a
compliance directive but also because, for many, it is the embodiment of
best practice.

Regulations

Basel II (and particularly Pillar II of the Accord) places responsibility on financial
institutions for data quality and data management. Banks
must examine the accuracy of their risk exposure calculations
across the entire business. For many, this encompasses
exposures from operations in many different countries.

Regulators
such as the Financial Services Authority (FSA) in the United Kingdom, the
Federal Reserve in the United States, and the Bundesbank in Germany have made
it a requirement that banks self-certify the accuracy, completeness,
and appropriateness of Basel-critical data. Banks must now tailor their
data management strategy to meet this requirement. Even the Reserve Bank of India (RBI) has
recently issued a notification laying down a timetable for all
scheduled commercial banks operating in the country to implement
the advanced approaches to regulatory capital measurement under the
Basel II framework.

An example of the explicit requirements for
data quality is highlighted in the FSA’s application pack for internal
ratings-based (IRB) approvals:

‘Describe how the firm ensures
that IRB (internal ratings based) data standards are met, and in
particular how it ensures the accuracy, completeness, and
appropriateness of the data underlying the firm’s regulatory capital
calculations.’

This criterion effectively moves data quality
out of the "it-would-be-nice-to-fix" status into an issue that must be
addressed to comply with banking regulations.

Banks need to
establish quantified and documented targets and robust processes to
test the accuracy of data in the following ways:

* Reconcile the inputs and outputs of the capital calculation with accounting systems (see the sketch after this list)
* Assign every exposure a probability of default (PD), a loss given default (LGD) and, if applicable, a credit conversion factor (CCF)
* Establish key risk indicators to monitor and ensure data accuracy
* Fully document processes for business and IT infrastructure
* Set clear and documented standards on ownership and timeliness of data
* Develop a comprehensive quantitative audit program
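
A minimal sketch of the first two checks, in Python; the field names (exposure_id, amount, pd, lgd, ccf, off_balance_sheet) and the tolerance are illustrative assumptions, not drawn from any particular bank's data model:

    # Illustrative checks: reconcile capital-calculation inputs with the
    # accounting ledger, and flag exposures lacking PD, LGD or (if applicable)
    # a credit conversion factor. Field names are assumptions for this sketch.

    def reconcile_totals(capital_inputs, ledger_total, tolerance=0.001):
        """Compare summed exposure amounts against the accounting ledger total."""
        calc_total = sum(e["amount"] for e in capital_inputs)
        difference = abs(calc_total - ledger_total)
        return {
            "calculated_total": calc_total,
            "ledger_total": ledger_total,
            "difference": difference,
            "reconciled": difference <= tolerance * ledger_total,
        }

    def missing_risk_parameters(exposures):
        """Flag exposures lacking PD, LGD or, for off-balance-sheet items, a CCF."""
        issues = []
        for e in exposures:
            if e.get("pd") is None:
                issues.append((e["exposure_id"], "missing PD"))
            if e.get("lgd") is None:
                issues.append((e["exposure_id"], "missing LGD"))
            if e.get("off_balance_sheet") and e.get("ccf") is None:
                issues.append((e["exposure_id"], "missing CCF"))
        return issues

    exposures = [
        {"exposure_id": "E1", "amount": 1_000_000, "pd": 0.02, "lgd": 0.45,
         "off_balance_sheet": False},
        {"exposure_id": "E2", "amount": 250_000, "pd": None, "lgd": 0.40,
         "off_balance_sheet": True, "ccf": None},
    ]
    print(reconcile_totals(exposures, ledger_total=1_250_000))
    print(missing_risk_parameters(exposures))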

These
new priorities require consolidated data collection across the
institution, so that data from all business units is brought together
into a single source, typically a data warehouse from which reports are
generated for risk and Basel II-related decisions. (Source: FSA CP 05/3,
January 2005, and BIPRU section 4.2.5)

Data Quality and Basel II

Almost
all leading banks have addressed these key priorities by investing in
the data infrastructure: data warehouses, risk engines, business
intelligence (BI) layers, and data integration software.

But at
no point in the data stream is data quality managed as an explicit
function. Instead, it is dealt with by tools not designed specifically
for the purpose. This is an important oversight because data quality is
a vital intersection point of infrastructure and the business. More
importantly, data quality is an explicit requirement for Basel II
compliance.

Scorecarding

Scorecarding became a
focal point for data quality under Basel II when the FSA’s CP 189 proposed
scorecards as an external audit point.

BASEL II DATA ACCURACY SCORECARDS

High standards of Data Accuracy: We propose quantifiable targets to cover completeness and accuracy that will rise over time.

Robust
control and systems environment: Firms are encouraged to develop
automated data capture processes to safeguard the integrity of the
calculation and reporting process with full and appropriate levels of
documentation.

Proposed standard: A self-assessment data
accuracy scorecard (DAS) that includes a mix of regulator- and firm-specified
targets that can be assessed through quantifiable tests that
we will agree with firms on an individual basis.

Structure of the
Data Accuracy Scorecard: We will set core targets that will apply to
all firms. Firms will set supplementary targets, and we will agree to
these on an individual basis depending upon their relevance to the firm. The
tests that are applied will be firm-specific and agreed with us.

The scorecard
will comprise prescribed areas, attainment targets, and tests. For
example, ‘completeness’ is an area (all assets have to be captured),
the target is 100%, and the test is reconciliation to the report and
accounts. - Financial Services Authority.
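
As an illustration only, a scorecard of this shape can be represented as a set of areas, each pairing an attainment target with a quantifiable test. The Python sketch below simply restates the FSA's 'completeness' example; the asset counts in it are invented:

    # Illustrative data accuracy scorecard: each area pairs an attainment target
    # with a quantifiable test, following the "completeness" example above.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class ScorecardArea:
        name: str                   # e.g. "completeness": all assets captured
        target: float               # attainment target, 1.0 means 100%
        test: Callable[[], float]   # quantifiable test returning the measured score

    def completeness_test() -> float:
        """Hypothetical test: share of assets in the Basel report that reconcile
        to the report and accounts (figures invented for this sketch)."""
        assets_in_accounts = 10_000
        assets_in_basel_report = 9_950
        return assets_in_basel_report / assets_in_accounts

    scorecard = [ScorecardArea("completeness", 1.0, completeness_test)]

    for area in scorecard:
        score = area.test()
        status = "met" if score >= area.target else "below target"
        print(f"{area.name}: score {score:.2%} against target {area.target:.0%} ({status})")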

Data Quality (DQ) Firewalls

The
chosen solutions should extend this compliance scorecarding approach
by applying "data quality firewalls" in front of the risk engines, whether
those engines are built in-house or supplied by third parties.

The firewall’s main
function is to identify poor-quality data before it enters the
engine, which removes the need for manual data remediation on
the risk engine’s log files and ensures that only high-quality data
enters the risk engines. Firewalls perform both automated and manual
tasks. For example, errors in non-transactional client reference data
can be automatically standardized, cleansed, and/or enriched on the
fly. Errors in transactional data are identified and presented to
business analysts for rapid remediation.
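
A minimal sketch of that routing logic in Python; the record fields, the country lookup table, and the "missing maturity date" rule are assumptions chosen only to illustrate the split between automated and manual remediation:

    # Illustrative data quality firewall: reference-data errors are standardized
    # automatically, transactional errors are queued for analyst remediation,
    # and only clean records pass through to the risk engine.

    COUNTRY_FIXES = {"U.K.": "GB", "UK": "GB", "United Kingdom": "GB"}  # sample lookup

    def standardize_reference(record):
        """Automatically standardize non-transactional client reference data."""
        record["country"] = COUNTRY_FIXES.get(record.get("country"),
                                              record.get("country"))
        return record

    def firewall(records):
        clean, for_analyst = [], []
        for record in records:
            record = standardize_reference(record)
            # Transactional errors (e.g. a missing maturity date) need human review.
            if record.get("maturity_date") is None:
                for_analyst.append(record)
            else:
                clean.append(record)
        return clean, for_analyst

    records = [
        {"exposure_id": "E1", "country": "UK", "maturity_date": "2015-06-30"},
        {"exposure_id": "E2", "country": "Germany", "maturity_date": None},
    ]
    to_risk_engine, remediation_queue = firewall(records)
    print("to risk engine:", to_risk_engine)
    print("for analyst remediation:", remediation_queue)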

The ideal risk
solution should perform analysis on all types of master data: customer
and counterparty data, market and credit data, and financial, reference,
and transactional data. This includes key data related to
probability of default (PD), loss given default (LGD), and exposure at default (EAD).
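
For context, these three parameters combine in the standard Basel expected-loss relationship, EL = PD × LGD × EAD, which is one reason gaps in any of them undermine the capital calculation. A one-line worked example with purely illustrative figures:

    # Worked example of the expected-loss relationship (illustrative figures).
    pd_, lgd, ead = 0.02, 0.45, 1_000_000   # 2% PD, 45% LGD, EAD of 1,000,000
    expected_loss = pd_ * lgd * ead         # EL = PD x LGD x EAD
    print(f"Expected loss: {expected_loss:,.0f}")   # -> 9,000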

Risk and Basel II DQ Management

The chosen solutions should provide a data quality management framework that gives the business total assurance to:

* Manage data quality on an aligned and integrated basis, meeting best practice for both legacy data management and new business development

* Measure and monitor data quality using existing and newly created internal reference data sources, third-party reference data sources, and the solution’s own reference data

* Act on areas identified for improvement without threatening the quality of existing data

* Handle change requests and new developments without threatening the quality of existing data

* Guarantee to senior management and the board the accuracy of the data being stored, generated, and used for decisions

* Match data against trusted reference sources for validation and enrichment (see the sketch after this list)

* Monitor gaps in data accuracy on an ongoing basis, cleanse the data, and identify incidences of non-conformance

* Deploy a data quality firewall ensuring that new data is consistent with risk management and Basel II requirements, among others
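
A minimal sketch of the reference-source matching mentioned above; the trusted counterparty table, the legal entity identifier key, and the enrichment fields are assumptions for illustration only:

    # Illustrative validation and enrichment against a trusted reference source.
    TRUSTED_COUNTERPARTIES = {  # hypothetical internal or third-party reference data
        "LEI-001": {"legal_name": "Acme Holdings PLC", "country": "GB", "rating": "A-"},
    }

    def validate_and_enrich(record):
        """Match a counterparty record to the trusted source and fill gaps."""
        reference = TRUSTED_COUNTERPARTIES.get(record.get("lei"))
        if reference is None:
            return record, False        # no match: flag for review
        for field, value in reference.items():
            record.setdefault(field, value)   # enrich only missing attributes
        return record, True

    record = {"lei": "LEI-001", "legal_name": "Acme Holdings"}
    enriched, matched = validate_and_enrich(record)
    print(matched, enriched)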

DQ Starter Pack: Risk & Basel II Management

In deciding on the starter pack for Risk & Basel II management, the
chosen solution should offer users the following benefits:

* Framework data quality rules covering key attributes for risk-weighted asset calculation, probability of default calculation, exposures (dates, amounts, and limits), obligors (dates, basic address), ratings (obligor and product), and securitization. All of the rules are extensible for customer-specific requirements (see the first sketch after this list).

* A schema for BI-vendor-independent reporting that supports high-level aggregated data quality metrics for senior management, drill-down by multiple dimensions, and detailed results, including indicators of potential loss on a per-business-rule basis (see the second sketch after this list).
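
One way such framework rules might be expressed, sketched in Python: each rule names the attribute it covers and a predicate, and customer-specific rules are appended to the same registry. The attribute names and rules below are illustrative, not the actual starter-pack rules:

    # Illustrative rule registry for key Basel II attributes; customer-specific
    # rules extend it by appending to the same list.
    import datetime

    def valid_date(value):
        try:
            datetime.date.fromisoformat(value or "")
            return True
        except (TypeError, ValueError):
            return False

    FRAMEWORK_RULES = [
        ("pd", "PD must lie between 0 and 1",
         lambda e: e.get("pd") is not None and 0 <= e["pd"] <= 1),
        ("exposure_amount", "Exposure amount must be positive",
         lambda e: (e.get("exposure_amount") or 0) > 0),
        ("maturity_date", "Maturity date must be a valid date",
         lambda e: valid_date(e.get("maturity_date"))),
    ]

    # Customer-specific extension added alongside the framework rules.
    FRAMEWORK_RULES.append(
        ("obligor_rating", "Obligor rating must be populated",
         lambda e: bool(e.get("obligor_rating")))
    )

    exposure = {"pd": 0.03, "exposure_amount": 500_000, "maturity_date": None}
    failures = [message for _, message, check in FRAMEWORK_RULES if not check(exposure)]
    print(failures)   # -> rules that this exposure breaches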
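
And as a sketch of what a vendor-independent reporting schema might contain (the table and column names are assumptions, not a prescribed layout): results are kept at per-rule granularity so that senior-management aggregates and drill-downs are plain queries over the same table.

    # Illustrative schema for BI-vendor-independent data quality reporting.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dq_rule_result (
        run_date        TEXT,     -- measurement date
        business_unit   TEXT,     -- drill-down dimension
        business_rule   TEXT,     -- e.g. 'maturity date populated'
        records_tested  INTEGER,
        records_failed  INTEGER,
        potential_loss  REAL      -- indicator of potential loss for this rule
    );
    """)
    conn.execute(
        "INSERT INTO dq_rule_result VALUES "
        "('2009-11-10', 'Corporate', 'maturity date populated', 10000, 120, 250000.0)"
    )

    # High-level aggregate for senior management; drill-down is a GROUP BY
    # over business_unit or business_rule against the same table.
    failure_rate = conn.execute(
        "SELECT SUM(records_failed) * 1.0 / SUM(records_tested) FROM dq_rule_result"
    ).fetchone()[0]
    print(f"Overall failure rate: {failure_rate:.2%}")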

(The author is the MD of Informatica South Asia)