Overview
On Site
$60 - $65
Accepts corp-to-corp applications
Contract - W2
Contract - 12 Month(s)
Skills
Property and Casualty (P&C)
Snowflake
Databricks
MDM
Data Governance
ETL
SQL
Job Details
Title: Data Engineer - P&C Insurance
Location: Chicago, IL
Employment Type: Contract
Job Summary:
- Deep expertise in the Property and Casualty (P&C) insurance domain.
- Strong skills in both data ETL and SQL.
- 10+ years of experience.
- Responsible for both development and testing.
- Excellent communication skills.
Key Responsibilities:
- Design, build, and maintain scalable and reliable ETL/ELT pipelines to ingest, transform, and integrate data from policy, claims, billing, and external insurance data sources.
- Collaborate with business stakeholders, actuaries, underwriters, and data scientists to translate P&C insurance domain requirements into robust data models.
- Develop and optimize data warehouses, data lakes, and cloud-based platforms (AWS/Azure/Google Cloud Platform) to support reporting and analytics.
- Work with structured and unstructured data, including exposure, risk, and claims data.
- Ensure data quality, governance, and lineage are maintained across data ecosystems.
- Collaborate with cross-functional teams to support predictive modeling, loss reserving, fraud detection, and pricing analytics.
- Automate data workflows and monitoring for high performance and reliability.
- Maintain documentation of data pipelines, dictionaries, and insurance data mappings.
Required Skills & Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 10+ years of experience in data engineering roles, preferably in the P&C insurance domain.
- Strong hands-on experience with ETL tools (Informatica, Talend, DataStage, Matillion, etc.) or custom ETL using Python/Scala/Spark.
- Proficiency with SQL, data modeling, and relational databases (Oracle, SQL Server, PostgreSQL, etc.).
- Experience with cloud data platforms (AWS Redshift, Azure Synapse, Google Cloud Platform BigQuery, Snowflake).
- Familiarity with insurance data standards (ACORD, ISO, policy, claims, billing datasets).
- Knowledge of big data frameworks (Hadoop, Spark, Databricks).
- Strong understanding of data governance, lineage, and master data management (MDM) in an insurance context.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Experience in Actuarial/Underwriting data integration.
- Familiarity with Guidewire, Duck Creek, or other P&C core platforms.
- Exposure to machine learning pipelines for predictive modeling in insurance.
- Knowledge of regulatory and compliance requirements for insurance data (NAIC, GDPR, HIPAA, SOX).
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.