a.s.r. asset management creates a data architecture in collaboration with Xomnia

The asset management division of Dutch insurance giant a.s.r. collaborated with Xomnia to successfully develop and implement a comprehensive data architecture. The project was executed in two distinct phases, from strategic planning to practical execution.

Initiated in February 2022, the partnership began with a focus on understanding the requirements necessary to establish a high-level architectural framework. From there, the teams moved on to crafting a detailed, low-level design and setting up a Proof of Concept (PoC) to validate it.

The tangible result of this collaboration is a robust, Azure-based platform that applies DevOps practices across tools such as Azure Data Factory, Azure Synapse, and Azure Machine Learning (AzureML). A key feature of the system is its clean separation of development, testing, and production environments. The platform currently hosts multiple ETL (Extract, Transform, Load) pipelines and models, demonstrating its capacity for handling the department's data operations.

"The architecture we built with Xomnia gives us a headstart for the analytical challenges that we will face over the coming years." - Huub Stam, Manager Data Analytics at a.s.r. asset management

Challenge

Over the years, various individuals and teams within the asset management department of a.s.r. had independently developed a range of tools, data pipelines, models, and dashboards locally on their work computers. Upon completion, these components were typically deployed to on-premises servers. However, this process often lacked standardized guidelines for distinguishing between development and production environments.

Recognizing the need for a more structured approach, the asset management BI department aspired to transition from their on-premises data warehouse to a centralized data platform on Azure Cloud. Their goal was to establish a DTAP (Development, Testing, Acceptance, and Production) setup, enhancing the organization of both formal data streams (for regulatory bodies like DNB) and sandbox environments (for business and data science purposes).
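To make the idea concrete, a minimal sketch of how such a DTAP split can be expressed in code is shown below. The environment names, storage-account names, and approval rules here are purely illustrative assumptions, not a.s.r.'s actual configuration:

```python
from dataclasses import dataclass

# Hypothetical DTAP configuration; the names and settings below are
# illustrative only, not a.s.r.'s actual setup.
@dataclass(frozen=True)
class Environment:
    name: str
    storage_account: str     # Azure storage account backing this environment
    requires_approval: bool  # whether deployments need a manual approval gate

DTAP = {
    "dev":  Environment("dev",  "stassetmgmtdev",  requires_approval=False),
    "test": Environment("test", "stassetmgmttest", requires_approval=False),
    "acc":  Environment("acc",  "stassetmgmtacc",  requires_approval=True),
    "prod": Environment("prod", "stassetmgmtprod", requires_approval=True),
}

def promotion_path(start: str = "dev") -> list[str]:
    """Return the order in which a change moves through DTAP."""
    order = ["dev", "test", "acc", "prod"]
    return order[order.index(start):]
```

The point of the structure is that a change can only reach production by passing through each preceding environment, with formal streams gated more strictly than sandbox work.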

a.s.r.’s asset management department sought the expertise of Xomnia to address the challenge of scattered data and to create a unified repository for building, running, accessing, and managing all data and models. This initiative aimed to streamline their data management and leverage the advanced capabilities of cloud-based solutions.


Solution

The client’s initial requirement was to draft a design of the architecture needed to create such a platform, from high-level requirement gathering to a low-level design tested in a PoC environment.

First, setting the strategy

The partnership between a.s.r.'s asset management department and Xomnia commenced with an analytics translator from Xomnia leading the initial exploratory phase. This first step was dedicated to understanding and addressing the unique challenges that the client faced.

The analytics translator helped educate the a.s.r. team on the potential of cloud systems, addressing critical questions about optimal solutions for hosting applications in the cloud, identifying the most suitable Azure building blocks, and advising on the best profiles to recruit in support of these initiatives.

The result of this initial phase was the decision to create a data processing platform characterized by a clear separation of environments. This was to be achieved through the adoption of key DevOps principles, such as continuous integration and continuous deployment (CI/CD).

Next, creating data pipelines and applying use cases

This phase started with a PoC to validate key assumptions in the architecture and design of the platform. Following this, Xomnia's consultant focused on developing CI/CD pipelines to ensure a seamless flow of work from development to the actual production environment, tailored specifically for Azure Data Factory, Azure Synapse, and AzureML.

Once the architecture was established and the appropriate DevOps pipelines were in place, we moved on to deploying various projects within the cloud environment. Although these projects had different components, they also shared many similarities. To manage the data requirements of these projects, ETL (Extract, Transform, Load) pipelines were set up. This allowed the necessary data to be readily available within the Azure Data Lake.
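The shape of such an ETL step can be sketched with a minimal, standard-library-only example. The field names and sample data are hypothetical; in the actual platform, extraction and loading run against Azure Data Lake storage under Azure Data Factory orchestration rather than in-memory strings:

```python
import csv
import io
import json

# Illustrative ETL step: extract CSV rows, transform (filter and cast),
# load as JSON lines. A real pipeline on the platform would read from
# and write to the Azure Data Lake instead of in-memory strings.
def extract(raw_csv: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    # Keep only rows with a usable price and cast the value to float.
    return [
        {"isin": r["isin"], "price": float(r["price"])}
        for r in rows
        if r["price"] not in ("", "NA")
    ]

def load(rows: list[dict]) -> str:
    # Serialize to JSON lines; a real pipeline would write to the lake.
    return "\n".join(json.dumps(r) for r in rows)

raw = "isin,price\nNL0000000001,101.5\nNL0000000002,NA\n"
result = load(transform(extract(raw)))
```

Separating extract, transform, and load into distinct steps is what lets shared logic be reused across the otherwise similar projects mentioned above.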

Furthermore, the models developed in these projects were exported to AzureML and Synapse and integrated into the ETL pipelines. We also connected reporting tools to the new data resources in the data lake, enabling easier access to and analysis of the information.
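Integrating a model into an ETL pipeline amounts to adding a scoring step to the flow. The sketch below uses a stand-in rule as the "model"; in the actual platform the model lives in AzureML and is invoked from the pipeline:

```python
# Illustrative sketch of a model-scoring step inside an ETL flow.
# The threshold rule is a hypothetical stand-in for a model that, in
# the actual platform, is hosted in AzureML and called from Synapse.
def score(row: dict) -> dict:
    return {**row, "flagged": row["price"] > 100.0}

def pipeline(rows: list[dict]) -> list[dict]:
    # Apply the scoring step to every transformed row before loading.
    return [score(r) for r in rows]
```

Because the scoring step takes and returns plain rows, it slots between the transform and load stages without the rest of the pipeline needing to know where the model is hosted.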

Impact

Adhering to DevOps principles and effectively structuring the MLOps environment facilitated the process of building stable and scalable data pipelines. This forms a solid foundation on which the asset management team can develop and productionize future data flows, and the platform is ready for additional projects and use cases to be implemented and deployed on it.

The anticipated impact of the project is a more data-driven approach for a.s.r.'s asset management team, enabling them to fully harness the potential of their data. The integration of DevOps principles in Azure will facilitate improved team synchronization, real-time data updates, and robust version control. Ultimately, this will lead to a simpler, more efficient, and more transparent work process for both a.s.r.'s employees and their clients.