Data Modeling
Simplify and Standardize Financial Data Processing
industry
Insurance
technology used:
Outline of the Challenge:
A worldwide leader in the insurance industry was aiming to replace its existing financial systems with a centralized solution. The vision was to enable not only financial posting but also regulatory and management reporting, both for the group and for the local entities.
Because of a very complex source landscape, the project was divided into waves.
Each wave brings new local entities on board, but it also raises new requirements for the cloud solution, from both a central and a local perspective.
Proposed Solution
A multi-tenant architecture backed by a scalable Data Vault modeling approach, divided into core features and local extensions.
Team Size
2 experienced data modelers from Data Hiro
Project Journey
The journey of implementing a Data Vault solution involves several key stages, each of which contributes to the successful deployment and adoption of the data platform.
1
Discovery and Planning:
The journey began with discovery and planning, where stakeholders identified the business objectives, data integration needs, and key requirements for the Data Vault solution. This stage involved conducting stakeholder interviews, gathering business requirements, and assessing the current state of data infrastructure and systems.
2
Core Data Modeling and Design:
Once the requirements were understood, the next step was to design the Data Vault schema. This involved creating hub tables, link tables, and satellite skeletons to represent business entities, relationships, and descriptive attributes.
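The hub/link/satellite pattern mentioned above can be sketched in a few tables. This is a minimal, illustrative example only: the entity and column names (policy, claim, premium) are assumptions for demonstration, not the client's actual model.

```python
import sqlite3

# In-memory database just for illustrating the Data Vault table pattern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hub: one row per unique business key
CREATE TABLE hub_policy (
    policy_hk   TEXT PRIMARY KEY,      -- hash key derived from the business key
    policy_no   TEXT NOT NULL UNIQUE,  -- the business key itself
    load_dts    TEXT NOT NULL,         -- load timestamp
    record_src  TEXT NOT NULL          -- originating source system
);

-- Link: a relationship between two (or more) hubs
CREATE TABLE link_policy_claim (
    policy_claim_hk TEXT PRIMARY KEY,
    policy_hk       TEXT NOT NULL REFERENCES hub_policy(policy_hk),
    claim_hk        TEXT NOT NULL,
    load_dts        TEXT NOT NULL,
    record_src      TEXT NOT NULL
);

-- Satellite: descriptive attributes, historized by load timestamp
CREATE TABLE sat_policy_details (
    policy_hk   TEXT NOT NULL REFERENCES hub_policy(policy_hk),
    load_dts    TEXT NOT NULL,
    premium     REAL,
    currency    TEXT,
    record_src  TEXT NOT NULL,
    PRIMARY KEY (policy_hk, load_dts)
);
""")
```

Because descriptive attributes live only in satellites, new attributes or new sources can be added as additional satellites without touching the hubs and links that other entities depend on.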
3
Group Data Model Governance Definition:
To share the model with local entities and manage its growth, data model governance was established. It included standards, guidelines, and best practices for creating, documenting, and managing data models. These standards ensured consistency, interoperability, and reusability of data models across the organization.
4
Local Entities Onboarding:
The core data model and the governance principles were rolled out to every entity in the program. Clear boundaries were set between the core model and the local model extensions.
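One way such a core/local boundary could be enforced is with a simple convention check. The rules and names below are hypothetical assumptions for illustration, not the program's actual governance rules.

```python
# Hypothetical convention: core tables are owned by the group model and may
# never be redefined locally; local entities may only add satellites or links
# prefixed with their entity code.
CORE_TABLES = {"hub_policy", "link_policy_claim", "sat_policy_details"}

def validate_local_extension(table_name: str, entity_code: str) -> bool:
    """Return True if a local entity is allowed to create this table."""
    if table_name in CORE_TABLES:
        return False  # core objects belong to the group model
    allowed_prefixes = (f"sat_{entity_code}_", f"link_{entity_code}_")
    return table_name.startswith(allowed_prefixes)

# Example: a hypothetical German entity ("de") adds a local satellite.
validate_local_extension("sat_de_policy_tax", "de")  # True: local satellite
validate_local_extension("hub_policy", "de")         # False: core table
```

A check like this can run in CI or as part of a deployment gate, so boundary violations are caught before a local change reaches the shared model.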
5
The journey doesn't end here:
it's an ongoing process of optimization and continuous improvement.
Results
A group-wide Data Vault model and framework for finance that supports both local and group requirements.
Key findings
Regardless of the modeling approach used, a lack of data governance makes any approach impossible to implement.
The Data Vault approach, with its flexibility, played a crucial role in the program's success. Nonetheless, the method has its shortcomings: with changing requirements and an evolving schema design, a proper migration and backward-compatibility approach is needed. Last but not least, any Data Vault implementation needs to be verified and adjusted against the selected tool stack.
Interested? Contact us or schedule a call!