Senior ETL Data Engineer (Healthcare)

Atlanta, GA, US • Posted 4 hours ago • Updated 4 hours ago
Full Time
On-site
$50 - $60/hr

Job Details

Skills

  • ASC X12
  • Amazon DynamoDB
  • Amazon Kinesis
  • Amazon Redshift
  • Amazon S3
  • Amazon SageMaker
  • Amazon Web Services
  • Analytics
  • Apache HTTP Server
  • Apache Kafka
  • Computerized System Validation
  • Continuous Delivery
  • Continuous Integration
  • Data Engineering
  • Data Quality
  • Data Warehouse
  • ELT
  • Electronic Data Interchange
  • Engineering Design
  • Extract, Transform, Load (ETL)
  • GitHub
  • HCPCS
  • HIPAA
  • HITECH
  • HL7
  • Health Care
  • ICD-10
  • JSON
  • Management
  • NPI
  • Orchestration
  • Performance Tuning
  • Stored Procedures
  • Streaming
  • Terraform
  • Testing
  • XML
  • PostgreSQL
  • PySpark
  • Python
  • Regulatory Compliance
  • SQL
  • Step Functions

Summary

Job Title: Senior ETL Data Engineer (Healthcare)

Location: Atlanta, GA (On-site)
Healthcare Experience: 5+ Years (Mandatory)
Client: Simplify/Enlace Health


Role Summary

We are looking for a Senior ETL Data Engineer with strong expertise in AWS-based data engineering and healthcare claims processing. This role involves designing and managing large-scale, HIPAA-compliant data pipelines that handle millions to hundreds of millions of claim records.

The candidate will act as a technical leader, working closely with analytics, clinical, compliance, and product teams.


Must-Have Skills

Healthcare Domain (Mandatory)

  • Strong experience with ANSI X12 EDI transactions: 837P, 837I, 837D
  • Knowledge of full claims lifecycle:
    • 835 (ERA), 270/271 (Eligibility), 276/277 (Claim Status)
  • Experience with:
    • ICD-10, CPT, HCPCS, NPI, Revenue Codes
  • Understanding of HIPAA 5010 compliance
  • Experience handling large-scale claims data (millions+)
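For context on what 837 handling involves: X12 files are segment-delimited text, commonly `~` between segments and `*` between elements, with the claim ID and total charge in the CLM segment. A minimal, illustrative parser sketch (real files carry ISA/GS envelopes and configurable delimiters, so production code should use a dedicated X12 library):

```python
def parse_segments(x12: str, seg_term: str = "~", elem_sep: str = "*"):
    """Split a raw X12 string into segments, each a list of elements."""
    return [seg.strip().split(elem_sep)
            for seg in x12.split(seg_term) if seg.strip()]

def extract_claims(x12: str):
    """Pull (patient control number, total charge) from CLM segments.

    CLM01 = submitter's claim ID, CLM02 = total claim charge amount.
    """
    claims = []
    for seg in parse_segments(x12):
        if seg[0] == "CLM":
            claims.append((seg[1], float(seg[2])))
    return claims

sample = "ST*837*0001~CLM*ABC123*125.50~SE*3*0001~"
print(extract_claims(sample))  # [('ABC123', 125.5)]
```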

AWS & Data Engineering

  • Strong hands-on with:
    • AWS Glue (PySpark ETL pipelines)
    • Amazon Redshift (data warehousing & performance tuning)
    • Amazon Athena
    • Amazon S3 & Lake Formation
  • Experience with:
    • Apache Iceberg (schema evolution, partitioning, time travel)
    • Amazon Kinesis (streaming ingestion)
    • AWS Step Functions / Lambda
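To illustrate the lake layout these services imply, here is a sketch of a Hive-style partitioned S3 key builder; the prefix and field names are hypothetical, but the `year=/month=/day=` convention is what lets Athena, Glue, and Iceberg prune partitions at query time:

```python
from datetime import date

def claim_object_key(service_date: date, payer_id: str, batch: str) -> str:
    """Build a Hive-style partitioned object key for a claims batch."""
    return (
        f"claims/year={service_date.year}"
        f"/month={service_date.month:02d}"
        f"/day={service_date.day:02d}"
        f"/payer={payer_id}/{batch}.parquet"
    )

print(claim_object_key(date(2024, 3, 7), "PAYER01", "batch-0001"))
# claims/year=2024/month=03/day=07/payer=PAYER01/batch-0001.parquet
```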

Programming & ETL

  • Strong in Python / PySpark
  • Experience building ETL/ELT pipelines at scale
  • Handling multi-format data:
    • EDI, JSON, CSV, XML, APIs, HL7 FHIR
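The multi-format requirement usually means normalizing inputs into a common record shape before loading. A stdlib-only sketch of that step (EDI and FHIR need dedicated parsers and are omitted here):

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def normalize(payload: str, fmt: str) -> list[dict]:
    """Coerce a JSON, CSV, or XML claim payload into a list of flat dicts."""
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "xml":
        root = ET.fromstring(payload)
        return [{child.tag: child.text for child in rec} for rec in root]
    raise ValueError(f"unsupported format: {fmt}")

print(normalize('{"claim_id": "A1"}', "json"))  # [{'claim_id': 'A1'}]
print(normalize("claim_id\nA2", "csv"))         # [{'claim_id': 'A2'}]
print(normalize("<claims><claim><claim_id>A3</claim_id></claim></claims>",
                "xml"))                          # [{'claim_id': 'A3'}]
```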

Databases & SQL

  • Expert-level SQL:
    • Joins, CTEs, window functions, query optimization
  • Hands-on experience with:
    • Amazon DynamoDB (GSI/LSI, single-table design)
    • PostgreSQL (partitioning, indexing, stored procedures)
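A representative window-function pattern, keeping only the latest version of each claim, sketched against an in-memory SQLite database; the same SQL runs on Redshift or PostgreSQL (SQLite needs version 3.25+ for window functions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id TEXT, received_at TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('C1', '2024-01-01', 100.0),
        ('C1', '2024-02-01', 110.0),
        ('C2', '2024-01-15', 50.0);
""")
# ROW_NUMBER() per claim_id, newest first; rn = 1 is the latest version.
rows = conn.execute("""
    WITH ranked AS (
        SELECT claim_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY claim_id ORDER BY received_at DESC
               ) AS rn
        FROM claims
    )
    SELECT claim_id, amount FROM ranked WHERE rn = 1 ORDER BY claim_id
""").fetchall()
print(rows)  # [('C1', 110.0), ('C2', 50.0)]
```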

Orchestration & DataOps

  • Apache Airflow (MWAA) DAG development
  • dbt transformations, testing, modeling
  • CI/CD tools:
    • GitHub Actions / AWS CodePipeline
  • Infrastructure as Code:
    • Terraform or CloudFormation
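The core of DAG development is expressing task dependencies that the scheduler resolves into an execution order. A stand-in for that ordering step using only the stdlib, with hypothetical task names for a claims pipeline:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of upstream tasks it depends on, the same
# dependency information an Airflow DAG encodes with >> operators.
deps = {
    "validate_837": {"ingest_s3"},
    "load_redshift": {"validate_837"},
    "dbt_models": {"load_redshift"},
    "dq_checks": {"dbt_models"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)
# ['ingest_s3', 'validate_837', 'load_redshift', 'dbt_models', 'dq_checks']
```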

Data Governance & Compliance

  • Experience with:
    • Data quality tools (Great Expectations / AWS Deequ)
    • Data lineage & monitoring (CloudWatch, SNS)
  • Strong knowledge of:
    • HIPAA / HITECH compliance
    • Encryption (KMS), IAM access control
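As an example of the kind of field-level data-quality check involved: NPI check digits use the Luhn algorithm over the 9-digit base prefixed with "80840" (per the CMS NPI standard). A minimal validator sketch:

```python
def luhn_checksum(number: str) -> int:
    """Luhn sum, doubling every second digit from the right."""
    total = 0
    for i, d in enumerate(int(c) for c in reversed(number)):
        if i % 2 == 0:
            dd = d * 2
            total += dd - 9 if dd > 9 else dd
        else:
            total += d
    return total

def is_valid_npi(npi: str) -> bool:
    """Validate a 10-digit NPI: check digit is Luhn over '80840' + base."""
    if len(npi) != 10 or not npi.isdigit():
        return False
    check = (10 - luhn_checksum("80840" + npi[:9]) % 10) % 10
    return check == int(npi[9])

print(is_valid_npi("1234567893"))  # True
print(is_valid_npi("1234567890"))  # False
```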

Key Responsibilities

Claims Data Processing

  • Process and validate EDI 837 transactions at scale
  • Handle complete claims lifecycle workflows
  • Work with multi-source healthcare data ingestion

ETL & Data Architecture

  • Build scalable AWS Glue pipelines
  • Design Iceberg-based data lakes
  • Optimize Redshift data warehouse performance

Data Engineering

  • Design and manage DynamoDB & PostgreSQL systems
  • Optimize queries for large-scale datasets

Orchestration & Automation

  • Build and maintain Airflow DAGs
  • Implement CI/CD pipelines and automation

Data Quality & Governance

  • Ensure data accuracy, lineage, and auditability
  • Maintain compliance with healthcare regulations

Advanced Analytics / ML

  • Work with SageMaker / Redshift ML
  • Build anomaly detection & duplicate claim detection systems
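A common first pass at duplicate-claim detection is fingerprinting the fields that identify a resubmission. The field names below are hypothetical, and real matching rules are richer (modifiers, units, adjustment codes), but the hashing pattern is representative:

```python
import hashlib

DUP_FIELDS = ("member_id", "provider_npi", "service_date",
              "procedure_code", "charge_amount")

def claim_fingerprint(claim: dict) -> str:
    """Hash the fields that typically identify a duplicate submission."""
    key = "|".join(str(claim[f]) for f in DUP_FIELDS)
    return hashlib.sha256(key.encode()).hexdigest()

def find_duplicates(claims: list[dict]) -> list[str]:
    """Return claim IDs whose fingerprint was already seen in the batch."""
    seen, dupes = set(), []
    for c in claims:
        fp = claim_fingerprint(c)
        if fp in seen:
            dupes.append(c["claim_id"])
        seen.add(fp)
    return dupes

batch = [
    {"claim_id": "C1", "member_id": "M1", "provider_npi": "1234567893",
     "service_date": "2024-03-01", "procedure_code": "99213",
     "charge_amount": 125.0},
    {"claim_id": "C2", "member_id": "M1", "provider_npi": "1234567893",
     "service_date": "2024-03-01", "procedure_code": "99213",
     "charge_amount": 125.0},
]
print(find_duplicates(batch))  # ['C2']
```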

Preferred / Nice-to-Have

  • AWS Certification (Data Engineer / Solutions Architect)
  • Experience with:
    • Apache Kafka / Amazon MSK
    • AWS HealthLake / FHIR platforms
    • HEDIS, HCC/RAF models
    • Data Mesh architecture
  • Dice Id: 91156937
  • Position Id: 8964458

Company Info

About RITWIK Infotech Inc

RITWIK Infotech, Inc., an Oracle Cloud partner and ISO 9001:2008-certified company, is an IT consulting and software services firm focused on enterprise applications and solutions in Oracle E-Business Suite, Oracle Fusion Cloud Applications, NetSuite Cloud ERP, and Business Intelligence technologies.
