Research & Thoughts

Data Fabric

Making a material difference at last

This Capco white paper presents a new pattern for the successful integration of front-to-finance systems, across products and geographies, within an investment bank. Data Fabric is the ‘ultimate silo buster’, transforming disparate data into a common resource and shared strength. It is the key to radically improved data quality, timeliness and availability, and one of its most important impacts is significantly improved ROE. That alone makes it a development investment banks cannot afford to ignore. Download the point of view to learn more.

Why bother?
Investment banks (IBs) cannot afford not to do this.

IBs are looking at a decline in ROE from 25% to 15%. This massive drop from pre-crisis returns is driven in large part by regulatory compliance costs, as well as a down market. They now need to fill the gap, and the greater efficiencies brought by the data fabric approach will be a powerful contributor.

It is crucial to realize that this is not about ‘keeping the geeks happy’. It is not even (exclusively) about regulatory compliance. The need for profound change is fundamentally a question of more efficient use of capital and collateral, of maintaining parity with competitors and securing market advantage. The days of ‘data in a silo’ are numbered – external factors will ensure this, whether investment banks’ internal cultures like it or not. Data fabric is the way forward.

What is data fabric in principle?
The Data Fabric Integration Design pattern is the ‘ultimate silo buster’. It transforms the organization’s data from a series of disparate sources into a shared resource, shared currency and shared strength.

We’ve been hearing about data grids and data fabric for fifteen years. Why implement now?
The fundamental challenge now being addressed, and overcome, is data architecture: in brief, how to model, consolidate, govern and manage the data. Many solutions exist to address particular, isolated aspects of the problem, including master data management, data caching and replication. The strength of the data fabric approach is that it acts as a hub for plugging in the required solutions, while at the same time offering a significant stepping-stone to the ‘nirvana’ of clean, consistent and current data, available as required.

We know there are advantages, but is data fabric still too disruptive and too painful?

