Job Summary:
We are seeking an experienced Data Engineer with strong expertise in Azure cloud services, Data Engineering practices, and DevOps automation, combined with a solid understanding of the Property & Casualty (P&C) Insurance domain. The ideal candidate will play a key role in designing, building, and optimizing data solutions to support underwriting, claims, and policy analytics initiatives.
Key Responsibilities:
Design and build event-driven architectures, especially event-streaming solutions; hands-on experience with Azure Event Hubs is a strong plus.
Design and develop data pipelines and ETL processes using Azure Data Factory, Databricks, and related Azure services.
Develop and maintain serverless computing solutions using Azure Functions.
Implement and maintain data models and data warehouses supporting P&C Insurance operations such as claims, policy, underwriting, and billing.
Collaborate with business stakeholders and product teams to translate insurance data requirements into technical solutions.
Utilize Azure DevOps (ADO) for version control and CI/CD automation: configure pipelines, manage build releases, and automate deployments of data solutions across development and production environments.
Apply the medallion architecture and templatized pipelines across the data estate.
Ensure data quality, governance, and lineage across multiple data systems and reporting environments.
Work closely with data analysts, actuaries, and business teams to deliver insights and analytical solutions.
Develop scripts and automation to optimize data ingestion, transformation, and validation workflows.
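For context, the medallion architecture referenced above layers data as bronze (raw), silver (validated), and gold (aggregated). A minimal sketch in plain Python, using dicts in place of Delta tables; field names such as claim_id, policy_id, and paid_amount are hypothetical, and in practice these layers would be Delta tables in Databricks/ADLS:

```python
# Illustrative medallion-style flow (bronze -> silver -> gold) over
# hypothetical P&C claims records. Not a production implementation.
from collections import defaultdict

def to_silver(bronze_rows):
    """Validate and standardize raw claim records (silver layer)."""
    silver = []
    for row in bronze_rows:
        # Basic data-quality gate: drop records missing required keys.
        if not row.get("claim_id") or not row.get("policy_id"):
            continue
        silver.append({
            "claim_id": str(row["claim_id"]).strip(),
            "policy_id": str(row["policy_id"]).strip().upper(),
            "paid_amount": float(row.get("paid_amount", 0.0)),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate paid amounts per policy (gold layer for analytics)."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["policy_id"]] += row["paid_amount"]
    return dict(totals)

bronze = [
    {"claim_id": "C1", "policy_id": "p-100", "paid_amount": "250.00"},
    {"claim_id": "C2", "policy_id": "p-100", "paid_amount": 100},
    {"claim_id": None, "policy_id": "p-200"},  # rejected: missing claim_id
]
print(to_gold(to_silver(bronze)))  # {'P-100': 350.0}
```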
Required Skills & Experience:
12+ years of experience in data engineering, data analytics, or related fields.
Strong knowledge of P&C Insurance processes (Claims, Policy Administration, Billing, Underwriting, etc.).
Hands-on experience with Azure Data Services: Azure Data Factory, Databricks, Synapse, ADLS, and Azure SQL DB.
Proficiency with DevOps tools: Azure DevOps, Git, and CI/CD pipelines.
Strong skills in SQL, Python, and data modeling (dimensional, relational, or lakehouse).
Experience in building and maintaining data pipelines, data marts, and data lake architectures.
Familiarity with reporting tools such as Power BI or other visualization platforms is an advantage.
Excellent communication and problem-solving skills with the ability to work in an agile team environment.