Senior Data Engineer - Property and Casualty/Insurance Experience

Overview

Remote
$115,000 - $120,000
Full Time
Unable to Provide Sponsorship

Skills

Data Science
Data Storage
Data Structure
Data Warehouse
Databricks
Extract, Transform, Load (ETL)
Insurance
Machine Learning (ML)
Management
Master Data Management
Financial Services
Google Cloud Platform
Microsoft Azure
Computer Science
Customer Analysis
Artificial Intelligence
Big Data
Cloud Computing
Collaboration
Communication
Amazon Web Services
Analytics
Apache Hadoop
Property And Casualty Insurance
Regulatory Reporting
Scala
Snowflake Schema
Amazon Redshift
Apache Kafka
Apache Spark
Data Engineering
Data Governance
KPI
Microsoft Power BI
Pricing
Python
SQL
Tableau
Team Leadership
Terraform
Underwriting
Visualization
ELT
Warehouse
Actuarial Science

Job Details

Senior Data Engineer - Property and Casualty Insurance Experience

NOT OPEN FOR C2C. ONLY OPEN FOR W2 FULL-TIME SEEKERS.

LOCATION: Remote (at this time; could change in the future)

Must-Haves:

- Communication is KEY!

- Insurance experience is a must; Property & Casualty experience is a big plus.

Job Description

We are seeking a Senior Data Engineer to design and implement scalable data architectures and analytics platforms for clients in the Property & Casualty (P&C) insurance sector. In this role, you will lead the development of modern data pipelines and cloud data solutions that support critical insurance operations such as underwriting, claims, pricing, and customer analytics.


Key Responsibilities

  • Architect and implement cloud-native data solutions on AWS, Azure, or Google Cloud Platform.
  • Design and optimize scalable ETL/ELT pipelines using tools like Databricks, Apache Spark, and Kafka.
  • Manage and develop cloud-based data storage and warehouse solutions, including Snowflake, BigQuery, Redshift, and Azure Synapse.

  • Collaborate with actuarial, underwriting, and analytics teams to deliver data products that support core insurance KPIs and regulatory reporting.

  • Support the deployment and operationalization of AI/ML models into scalable pipelines.
  • Ensure best practices around data governance, security, and master data management (MDM).
  • Lead and collaborate with cross-functional teams in the delivery of data engineering and cloud analytics initiatives.

  • Apply domain knowledge to handle complex P&C insurance data, including claims, policy, premium, and exposure data models.

Qualifications

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
  • 9+ years of experience in data engineering, preferably in financial services or insurance.
  • Proven experience building and optimizing data pipelines in a production environment.
  • Strong SQL and Python skills; experience with Scala is a plus.
  • Hands-on experience with big data platforms such as Databricks, Hadoop, Spark, or similar frameworks.
  • Proficiency in at least one major cloud platform (AWS, Azure, or Google Cloud Platform).
  • Experience working with modern data warehouses like Snowflake, BigQuery, or Redshift.
  • Familiarity with P&C insurance data structures and actuarial or regulatory reporting is a plus.
  • Excellent communication and team leadership skills.

Preferred Experience With:

  • Cloud ETL tools: Airflow, dbt, Azure Data Factory, AWS Glue
  • Infrastructure as code: Terraform, CloudFormation
  • Visualization tools: Power BI, Tableau
  • Data governance frameworks and MDM tools


Note: We do not expect expertise in every tool listed. Strong foundational skills and experience with similar technologies will be considered.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Appic Solutions