Sr. Data Engineering Lead Developer with strong Databricks and Python experience (100% remote)

Overview

Remote
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 month(s)
No Travel Required

Skills

Azure
Python
Spark
SAS
Databricks
Migration
Synapse
Data Engineering

Job Details

Location: 100% Remote
Type: Contract 
Industry: Healthcare 
The client is transitioning from SAS to Databricks to power next-generation actuarial analysis, underwriting, and risk modeling.
We're seeking a Data Engineer to lead this transformation: refactoring legacy SAS pipelines, modernizing the data architecture, and building scalable solutions on Databricks and Azure.
 
What You'll Do

•     Migrate and refactor SAS-based actuarial and underwriting models to Databricks (see the illustrative PySpark sketch after this list)
•     Design and implement scalable data pipelines using Azure Data Factory, Synapse, and Spark
•     Collaborate with actuarial teams, business stakeholders, and engineering peers
•     Optimize the performance, reliability, and flexibility of data workflows
•     Contribute to real-time analytics and AI-driven innovation across health plan operations
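
To give candidates a flavor of the refactoring work, here is a minimal, illustrative sketch of how a simple SAS DATA step might be re-expressed in PySpark on Databricks. All table and column names (raw.claims, paid_amt, premium, work.claims_adj) are hypothetical placeholders, not details from the client's environment:

# Hypothetical SAS source being refactored:
#   data work.claims_adj;
#     set raw.claims;
#     where paid_amt > 0;
#     loss_ratio = paid_amt / premium;
#   run;
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()   # pre-created in Databricks notebooks

claims = spark.read.table("raw.claims")      # hypothetical source table

claims_adj = (
    claims
    .where(F.col("paid_amt") > 0)                                    # SAS: where paid_amt > 0;
    .withColumn("loss_ratio", F.col("paid_amt") / F.col("premium"))  # SAS derived column
)

claims_adj.write.mode("overwrite").saveAsTable("work.claims_adj")    # persist as a managed table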
 
Required Qualifications
•     Bachelor’s or Master’s in Computer Science, Statistics, Applied Math, or related field
•     5+ years of programming experience (SQL and Python required; Java, R, or Spark a plus)
•     4+ years working with Databricks or similar cloud-native platforms
•     Hands-on experience with SAS-to-Databricks migration projects
•     Strong SQL skills and experience with Oracle, PostgreSQL, MySQL, or SQL Server
•     Experience with Git and collaborative development practices
•     Proven ability to work cross-functionally and communicate with technical and business teams
•     Expertise in data management, software engineering, and infrastructure operations
 
Preferred Skills
•     Healthcare domain knowledge
•     Agile/Scrum experience
•     Azure ecosystem: Data Factory, Synapse, Purview, Cosmos DB, Application Insights, Power BI
•     Python libraries: NumPy, pandas, matplotlib; Jupyter notebooks
•     CI/CD, MLOps, and DataOps practices
•     Event streaming tools: Kafka, NiFi, Flink
•     Big Data tools: Spark, Hive, Sqoop
•     NoSQL experience (MongoDB)
•     Advanced Power BI modeling (Power Query, DAX)