This engagement was a multi-year effort to increase operational excellence, reduce overall costs, and improve operational agility for the investment division of a global insurance company.
A key part of this vision was to enable the strategic use of data as an asset through a robust aggregation, analytics, and reporting capability, often referred to as a data lake. NEOS worked with the client to define a data lake solution that serves the needs of not only the Investment Division but the rest of the firm as well – an enterprise data lake (EDL). Beyond the initial work, NEOS provided expertise in defining the solution architecture and implementation plan for the data lake within the client's reporting and analytics workstream. NEOS designed and implemented the entire data intake and curation process, as well as the orchestration between platforms such as BI tools, to ensure it followed a repeatable nightly process.
The client had started their global optimization data workstream to address four key pain points.
The program had started to identify an overall architecture but needed help architecting a data lake, identifying key use cases, and building a roadmap to implement the lake while progressively realizing value from it.
In the first phase of the project, nine data sources (including SimCorp and Bloomberg Polar Lake data) were ingested into the data lake and curated, and three data marts were created. Processes were developed to ingest data and update the data marts daily, including quality-checking the feeds and curating the data.
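The daily cycle described above — ingest each feed, quality-check it, curate it, then refresh the data marts — can be sketched as follows. This is a minimal illustration only: the `Feed`, `quality_check`, and `nightly_run` names and their logic are assumptions for the sketch, not the client's actual implementation.

```python
"""Minimal sketch of a nightly ingest-and-curate cycle.
All names and rules here are illustrative assumptions."""

from dataclasses import dataclass, field
from datetime import date


@dataclass
class Feed:
    """A raw source feed: a named list of records (hypothetical shape)."""
    name: str
    rows: list = field(default_factory=list)


def quality_check(feed: Feed) -> bool:
    # Reject empty feeds or rows missing required keys (assumed rule).
    return bool(feed.rows) and all("id" in r and "value" in r for r in feed.rows)


def curate(feed: Feed) -> list:
    # Normalize field names and stamp each row with its source and load date.
    return [
        {"source": feed.name, "id": r["id"], "value": r["value"],
         "as_of": date.today().isoformat()}
        for r in feed.rows
    ]


def nightly_run(feeds: list, marts: dict) -> dict:
    """Ingest each feed, quality-check it, curate it, and refresh the marts."""
    curated = []
    for feed in feeds:
        if not quality_check(feed):
            continue  # a feed that fails quality checks is skipped, not loaded
        curated.extend(curate(feed))
    # Each mart is modeled as a filter over the curated data (illustrative only).
    return {name: [r for r in curated if rule(r)] for name, rule in marts.items()}


# Example run with two hypothetical feeds; the empty feed fails its check.
feeds = [Feed("SimCorp", [{"id": 1, "value": 100}]), Feed("BadFeed", [])]
marts = {"positions": lambda r: r["source"] == "SimCorp"}
result = nightly_run(feeds, marts)
```

Modeling the marts as filters over a single curated set keeps the sketch small; a real pipeline would persist each stage and run on a scheduler.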
This was accomplished in the first six months, and it was the only part of the program to be production-ready on time. Over the next three years, the sources grew to more than 25, including data from FactSet and Sylvan, and ingestion now happens multiple times daily. The consumer side grew to 40 data marts, with eight teams of consumers using them daily and others accessing them on an ad hoc basis.
The client now has a five-zone data lake that delivers updated, consumable data marts for reporting from several disparate data sources, including Bloomberg Polar Lake data, SimCorp general ledger data, Sylvan performance data, FactSet data, and many custom internal sources.
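A five-zone lake typically promotes data through successive stages of refinement. The zone names in this sketch (landing, raw, standardized, curated, consumption) follow a common lake layout and are assumed for illustration; the source does not name the client's actual zones.

```python
"""Illustrative flow of a record through a five-zone data lake.
Zone names are a common convention, assumed here, not the client's design."""

ZONES = ["landing", "raw", "standardized", "curated", "consumption"]


def promote(record: dict, lake: dict) -> dict:
    """Copy a record through each zone in order, tagging it with its zone
    so lineage is visible at every stage."""
    for zone in ZONES:
        record = {**record, "zone": zone}
        lake.setdefault(zone, []).append(record)
    return record


# Example: one record traverses all five zones.
lake: dict = {}
final = promote({"id": 1, "source": "SimCorp"}, lake)
```

Keeping a copy in every zone, as sketched, is what lets consumers query curated marts while data engineers can still trace any value back to its landed form.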
Because ingestion changes and enhancements are handled within the data lake, new reporting can be delivered in hours instead of months. The project was delivered on time and within budget even as new data sources were introduced and changed. The client can now manage data intake from North America and Asia within SLAs, enriching the data lake with significantly more global data and delivering enhanced business insights to consumers and data scientists.
For more information on Delivering Data at Scale please contact Robert Nocera.