Data Engineer

Overview

Work Arrangement: Hybrid (onsite 3 days per week)
Compensation: Depends on Experience
Contract Type: W2 or Independent
Contract Duration: 6 Month(s)

Skills

Azure Data Factory, Azure Databricks, SSAS, Azure Data Lake, SQL, PySpark, ETL, data warehousing

Job Details

Job Title: Data Engineer (13494-1)

Location: Princeton, NJ (Hybrid 3 days onsite per week)
Employment Type: Contract
Industry: Insurance / Reinsurance
Experience Level: Mid-Senior
Experience Required: 10+ Years
Education: Bachelor's Degree Required
Relocation Assistance: Not Provided
Total Openings: 1

Job Overview:

Marsh McLennan's Technology group is seeking a skilled Data Engineer to support the Global Analytics and Advisory business of Guy Carpenter. This role is part of a product-centric, enterprise-wide actuarial modernization effort supporting thousands of users worldwide. The ideal candidate will have significant hands-on experience with Microsoft and Azure BI ecosystems and a strong background in building scalable, enterprise-grade data pipelines and data warehouse solutions.

Key Responsibilities:

  • Design, develop, and maintain data pipelines using Azure Data Factory, Azure Databricks, and Azure Data Lake.
  • Build and manage data layers (staging, bronze, silver, gold) for downstream analytics and reporting.
  • Develop, process, and optimize complex SSAS Tabular Models and schedule them using Azure Automation Runbooks.
  • Work on ETL processes using ADF for structured and unstructured data from heterogeneous sources.
  • Collaborate with data analysts, actuarial teams, and software developers in Agile/SCRUM settings.
  • Document technical processes, pipeline logic, and data lineage for support and audit readiness.
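To illustrate the layered (bronze/silver/gold) responsibility above: in practice this work would be done in Azure Databricks with PySpark, but the promotion logic can be sketched in plain Python. All field names and records below are hypothetical, for illustration only.

```python
from datetime import date

# Bronze: raw records as ingested, duplicates and bad rows included.
bronze = [
    {"policy_id": "P1", "premium": "1000", "effective": "2024-01-01"},
    {"policy_id": "P1", "premium": "1000", "effective": "2024-01-01"},  # duplicate
    {"policy_id": "P2", "premium": "bad",  "effective": "2024-02-01"},  # unparseable
    {"policy_id": "P3", "premium": "2500", "effective": "2024-02-15"},
]

def to_silver(rows):
    """Silver layer: deduplicate, cast types, drop rows that fail validation."""
    seen, silver = set(), []
    for r in rows:
        key = (r["policy_id"], r["effective"])
        if key in seen:
            continue
        try:
            rec = {
                "policy_id": r["policy_id"],
                "premium": float(r["premium"]),
                "effective": date.fromisoformat(r["effective"]),
            }
        except ValueError:
            continue  # a real pipeline would quarantine rejected rows
        seen.add(key)
        silver.append(rec)
    return silver

def to_gold(rows):
    """Gold layer: aggregate to a reporting-ready measure (premium per month)."""
    totals = {}
    for r in rows:
        month = r["effective"].strftime("%Y-%m")
        totals[month] = totals.get(month, 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'2024-01': 1000.0, '2024-02': 2500.0}
```

The same shape applies at scale: each layer is a persisted table, and downstream analytics and reporting read only from gold.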

Technical Skills (Mandatory):

  • Azure BI & Data Stack:
    • Azure Data Factory (ADF)
    • Azure Databricks
    • Azure Analysis Services (SSAS)
    • Azure Data Lake Storage (ADLS)
    • Azure Data Lake Analytics
    • Azure SQL
    • Azure Integration Runtime
    • Azure Event Hubs
    • Azure Stream Analytics
    • dbt (data build tool)
  • Additional Tools & Technologies:
    • SQL Server Integration Services (SSIS)
    • SQL Server Reporting Services (SSRS)
    • PySpark
    • MongoDB
  • Data Warehouse Expertise:
    • Data modeling with Normalization/De-Normalization
    • Multi-layer data architecture (staging, bronze, silver, gold)
    • ETL pipeline design and automation

Preferred Domain Experience:

  • Actuarial or Insurance domain knowledge is highly preferred, especially:
    • Reinsurance broking workflows (placements, treaty structures, renewals)
    • Understanding actuarial rating models (exposure, experience, program layers)
    • Building data pipelines supporting actuarial pricing tools, analytics, and client reporting

Soft Skills & Team Fit:

  • Strong communicator with excellent analytical and problem-solving skills
  • Familiar with Agile/SCRUM practices and sprint-based delivery
  • Self-motivated, reliable, and able to work independently in a global delivery team
  • Proficient in documentation and technical writing

About Pinnacle Software Solutions