Fragmentation of inputs into supply chains has challenged sectors for decades: multiple origins, varying quality and heavy validation requirements create inefficiencies and risks. The sourcing and supply of data to financial institutions is one of the more recent ‘supply chains’ to come under scrutiny. Financial institutions face challenges in procuring and managing the data sets that inform investment strategies and support vital business operations. Data fragmentation occurs when data is derived from disparate sources, leading to breaks in data lineage. Many industries share the problem, but for capital markets firms a potential solution lies in focusing on the quality and reliability of the data source.
The process of data acquisition, management and distribution has a significant bearing on the quality, consistency, cost and value of data downstream. Now that there is more data to consume and more systems to feed with this content, operational efficiency becomes essential to an organization’s future success.
Data managers responsible for data supply chains highlight that the challenges inherent in fragmented data are numerous.
First, symbologies and identifiers vary across sources. Basic data structures and data sets are different, requiring significant time, effort and analytical computing power to make them homogeneous.
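The normalization burden described above can be sketched in a few lines. This is a minimal, hypothetical illustration: it assumes two vendors that deliver the same instrument under different field names, and every field name and sample record below is invented for the example.

```python
# Toy sketch of cross-source identifier normalization. Vendor names,
# field names and sample records are hypothetical, not real feeds.

def normalize(record: dict, source: str) -> dict:
    """Map a vendor-specific record onto one canonical schema keyed by ISIN."""
    field_map = {
        "vendor_a": {"id": "isin", "px": "last_price"},
        "vendor_b": {"id": "isin_code", "px": "close"},
    }
    m = field_map[source]
    return {"isin": record[m["id"]], "price": record[m["px"]]}

# The same instrument arrives under two different schemas...
a = normalize({"isin": "US0378331005", "last_price": 189.9}, "vendor_a")
b = normalize({"isin_code": "US0378331005", "close": 190.1}, "vendor_b")

# ...and only after normalization can the records be compared at all.
assert a["isin"] == b["isin"]
```

In practice this mapping layer must also reconcile competing symbologies (tickers, ISINs, CUSIPs, vendor codes), which is where much of the time and analytical effort goes.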
Second, financial institutions see inconsistencies in asset valuations and risk calculations when the data feeding these models is inconsistent, or when data sets are incorporated at different frequencies. If a trader and a risk officer see different risk values, they cannot work together effectively, or worse, real risk exposure could be overlooked.
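To make the frequency problem concrete, here is a toy illustration, with invented numbers, of how a desk working from a real-time feed and a risk function working from an end-of-day snapshot can report different exposures for the very same position:

```python
# Illustrative only: all figures are invented. The desk prices a position
# off an intraday feed, while risk uses yesterday's end-of-day snapshot.
position = 10_000        # shares held (hypothetical)
intraday_price = 48.50   # real-time feed used by the trading desk
eod_price = 50.00        # end-of-day feed used by the risk officer

desk_exposure = position * intraday_price   # 485,000
risk_exposure = position * eod_price        # 500,000

# The two views of the same book differ by 15,000 - a gap that exists
# purely because the inputs arrive at different frequencies.
gap = risk_exposure - desk_exposure
```

Neither number is "wrong" in isolation; the breakdown is that two functions in the same firm are reasoning from different snapshots of the same underlying fact.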
These factors can create process breakdowns between the front, middle and back offices as inconsistencies from multiple sources are resolved. Ultimately, data from multiple sources leads to higher operational risks and costs.
A Bloomberg qualitative study of Chief Data Officers (CDOs), conducted last July, found that the role remains formative at many companies in Asia. CDOs were grappling with educating key internal stakeholders on the importance of data quality, governance and compliance, particularly in the face of sweeping regulation such as Dodd-Frank, MiFID II, GDPR and BCBS 239.
As the CDO role becomes more established and companies get on top of regulatory requirements, CDOs typically begin shifting focus to surfacing efficiencies, value and insights from business data, the study found – and hence the emerging focus on tackling data fragmentation.
Data in Asia’s multi-dimensional regulatory landscape
The picture across the financial services sector is one of multiple sources of data ultimately driving greater complexity, requiring larger investments in human and technology capital to sort it out. Multiple distribution technologies can result in duplication in integration and maintenance. Significant data overlap is a cost inefficiency and fragmented data sources require multiple contractual relationships and models, further increasing complexity.
In Asia, data fragmentation has tended to be multi-dimensional, ranging across geographies, jurisdictions and regulatory regimes. While firms in the EU and US operate under unified frameworks such as MiFID II and Dodd-Frank, Asia has no single regulatory overlay. Combined with radically different political, economic, market and business environments, this makes high-level data consolidation difficult to achieve in the region.
Calls to better understand and integrate the data supply chain are growing, however, not least because of the increasingly borderless nature of both data and the delivery of financial services, and the need for companies operating in Asia to comply with, or at least align with, US and EU requirements.
That said, Asian firms have an opportunity to leapfrog their Western counterparts. Asian banks, for example, have in recent years started to build their own technology and have faced some tough data choices: adhere to one data source, or use multiple sources. Increasingly, machine learning and predictive analytics are compelling companies to build their data lakes in-house, giving them a single source of truth rather than a reliance on many third parties. Making the right data choice at the outset will set the course for many firms' futures.
The emerging market opportunity: One data, one source?
In recent years, we have observed that the financial services industry’s data needs are broad and ever evolving. There is a growing need for coverage of all asset classes and instruments; trustworthy reference and master data; real-time market data, pricing and valuation data; and business-critical analytics and risk calculations. Real-time data is particularly critical for firms assessing market liquidity, tracking volatility and managing risks.
All of these factors are compounded by extensive regulatory and accounting requirements that demand complex and defensible data linkages. Even as data needs become more complex, CDOs increasingly need simpler data and technology processes.
Interestingly, some financial services players in emerging markets may have an advantage over longer-established and more global players. Being cost-sensitive and often sparsely resourced ironically hedges against falling into the fragmented-data trap. Many of these firms prefer lighter, cloud-ready technology platforms that offer flexibility and agility. Acquiring high-quality data from a comprehensive source can create efficiencies in cost and compliance management by reducing the knock-on technological, energy, legal and operational costs of connecting to multiple data sources.
Over the long term, the industry may move towards a single source for its primary data needs, identifying a reliable technology partner to serve as that trusted source. One data source creates consistency across desks, reduces data breaks across business workflows, and lowers operating risks and overall data costs. Cost, as we understand from our clients, is the greatest pain point for CDOs today. In fact, 60% of our survey respondents said they plan to move to a single primary data vendor, and a further 31% said they are looking to reduce the number of vendors they use.
One thing appears to be beyond dispute: as market data, analysis and insight become even more important and regulatory data needs grow, access to quality data through less fragmented sources will rise up the agenda of CDOs and global data leaders across Asia.
[The author is Global Head of Enterprise Data, Bloomberg LP]