Software Engineer Interoperability & Data Platforms

Remote • Posted 3 hours ago • Updated 3 hours ago
Contract W2
Remote
$60 - $85/hr

Job Details

Skills

  • Interoperability
  • API
  • Apache Hadoop
  • Big Data
  • Data Engineering
  • Data Integration
  • ETL (Extract, Transform, Load)
  • ELT
  • Google Cloud Platform
  • Health Care
  • HL7
  • Python
  • PySpark
  • OAuth
  • JSON
  • JavaScript
  • RESTful
  • DevOps
  • Informatica PowerCenter
  • Informatica
  • GitLab

Summary

Job Title: Software Engineer Interoperability & Data Platforms
Duration: 6+ months (High possibility of extension)
Location: Fully remote

Job Summary
We are seeking a highly skilled Interoperability Software Engineer to design, build, and support enterprise-scale healthcare interoperability and data integration solutions. This role supports CMS/ONC, BCBSA, and enterprise interoperability initiatives, with a strong focus on FHIR-based APIs, SmileCDR, and high-volume ETL/ELT pipelines built on modern DataWorks platforms.
The ideal candidate brings hands-on development experience across SmileCDR, HL7 FHIR, Informatica Big Data Management (BDM), Python/PySpark, JavaScript, REST APIs, and cloud-native data platforms.

Key Responsibilities
Interoperability & API Development
Design, configure, and develop FHIR-based interoperability solutions using SmileCDR.
Implement and support HL7 FHIR resources (US Core, Da Vinci, CMS-mandated APIs).
Develop and maintain RESTful APIs for Patient Access, Provider Directory, Prior Authorization, and Payer-to-Payer use cases.
Build API integrations using JavaScript, REST, OAuth2, and JSON payloads.
Integrate external interoperability platforms such as Redox and third-party healthcare APIs.
Support API performance, monitoring, security, and compliance.
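As a minimal sketch of the Patient Access pattern described above: a client obtains an OAuth2 access token, sends it as a Bearer header, and parses the FHIR Patient resource returned as JSON. The payload, field names in the helper, and the token value below are illustrative placeholders, not taken from the posting or from SmileCDR's API.

```python
import json

# Hypothetical FHIR R4 Patient payload, shaped like a US Core Patient
# response from a Patient Access API (values are invented for the example).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Doe", "given": ["Jane"]}],
  "birthDate": "1980-04-01"
}
"""

def display_name(patient: dict) -> str:
    """Return 'Given Family' from the first HumanName entry."""
    name = patient.get("name", [{}])[0]
    return " ".join(name.get("given", []) + [name.get("family", "")]).strip()

def bearer_header(access_token: str) -> dict:
    """Headers for a request to an OAuth2-protected FHIR endpoint."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Accept": "application/fhir+json",
    }

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print(display_name(patient))   # Jane Doe
print(bearer_header("example-token")["Accept"])
```

In a real integration the token would come from an OAuth2 token endpoint (e.g. client-credentials or SMART-on-FHIR flows) and the JSON would come from an HTTPS GET against the FHIR server rather than a literal string.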
SmileCDR & FHIR Platform Responsibilities
Configure and manage SmileCDR repositories, FHIR endpoints, and data ingestion pipelines.
Develop FHIR mapping, transformation, and validation logic.
Implement SmileCDR workflows, interceptors, subscriptions, and data persistence strategies.
Support SmileCDR upgrades, patches, and production troubleshooting.
Data Engineering & ETL / ELT (High-Volume Data)
Design and build large-scale, high-volume ETL/ELT pipelines supporting clinical, claims, and member datasets.
Develop pipelines using Python, PySpark, and distributed processing frameworks.
Perform Source-to-Target Mapping, data transformations, and data quality validations.
Support incremental, batch, and near real-time ingestion patterns.
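The Source-to-Target Mapping and data-quality steps above can be sketched as follows. Column names and the validation rule are invented for illustration; in production this kind of logic would typically run inside a PySpark or Informatica pipeline, but plain Python keeps the mapping idea visible.

```python
# Hypothetical mapping for a claims feed: source column -> target column.
SOURCE_TO_TARGET = {
    "mbr_id": "member_id",
    "clm_amt": "claim_amount",
    "svc_dt": "service_date",
}

def transform(record: dict) -> dict:
    """Rename source columns to target names and apply a basic quality check."""
    out = {target: record.get(source) for source, target in SOURCE_TO_TARGET.items()}
    # Simple data-quality validation: claim amount must exist and be non-negative.
    if out["claim_amount"] is None or float(out["claim_amount"]) < 0:
        raise ValueError(f"bad claim_amount in record {record!r}")
    return out

rows = [{"mbr_id": "M001", "clm_amt": "125.50", "svc_dt": "2024-01-15"}]
print([transform(r) for r in rows])
```

The same rename-and-validate shape translates directly to a PySpark `select` with `withColumnRenamed` plus filter/assertion steps when the datasets are too large for a single machine.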
Informatica Big Data Management (BDM)
Develop and maintain data pipelines using Informatica Big Data Management (BDM).
Implement mappings, workflows, and transformations for Hadoop based and cloud data platforms.
Integrate Informatica BDM with Hadoop, Hive, Spark, and cloud storage.
Optimize BDM jobs for performance, scalability, and reliability.
Support metadata management, lineage, and operational monitoring.
DataWorks Platforms & Tools
Work with modern data engineering tools and platforms, including:
DBT: transformations, modeling, and analytics-ready datasets
Starburst / Trino: federated query and analytics
Apache Iceberg: large-scale table format and versioned datasets
Google Cloud Platform / BigQuery: cloud-native analytics and storage
Use Smile ETL tools and enterprise ingestion frameworks.
Support hybrid architectures spanning on-prem Hadoop and cloud platforms.
DevOps, CI/CD & Cloud Engineering
Build and maintain CI/CD pipelines using GitLab Pipelines.
Follow DevOps best practices for source control, automated testing, and deployments.
Support cloud deployments on Google Cloud Platform.
Ensure secure, repeatable, and compliant deployments across environments.
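As a sketch of the GitLab CI/CD practices listed above, a pipeline for such a service might be structured like this. Stage names, images, and the deploy command are assumptions for illustration, not taken from the posting; `$CI_REGISTRY_IMAGE` and `$CI_COMMIT_SHORT_SHA` are standard GitLab predefined variables.

```yaml
# Illustrative .gitlab-ci.yml skeleton (job names and images are assumptions).
stages:
  - test
  - build
  - deploy

unit-tests:
  stage: test
  image: python:3.11
  script:
    - pip install -r requirements.txt
    - pytest

build-image:
  stage: build
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .

deploy-gcp:
  stage: deploy
  script:
    - gcloud run deploy example-service --image "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
  environment: production
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

Gating the deploy job on the `main` branch and a named `environment` is one common way to keep deployments repeatable and auditable across environments.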
Collaboration & Delivery
Partner with product owners, architects, vendors, and compliance teams.
Support CMS/ONC regulatory timelines and audit readiness.
Participate in code reviews, design reviews, and technical documentation.
Provide production support and root cause analysis for critical data and API pipelines.

Required Skills & Experience
Interoperability & API Skills
SmileCDR (FHIR repository, configuration, ingestion, and APIs)
HL7 FHIR (US Core, Da Vinci implementation experience)
REST APIs, OAuth2, JSON
JavaScript for API and integration development
Redox or equivalent healthcare integration platforms
Data Engineering & ETL
ETL / ELT pipelines for large-scale, high-volume datasets
Python and PySpark
Source-to-Target Mapping
Data Modeling
Informatica & Big Data
Informatica Big Data Management (BDM)
Informatica PowerCenter / IDMC (preferred)
Hadoop, Hive, Spark
DataWorks & Analytics Platforms
DBT
Starburst
Apache Iceberg
Google Cloud Platform / BigQuery
DevOps & Cloud
GitLab CI/CD Pipelines
DevOps and release management practices
Cloud platforms (Google Cloud Platform preferred)

Preferred Qualifications
Healthcare payer or provider interoperability experience
CMS/ONC or BCBSA mandate experience
HL7 FHIR certification or equivalent hands-on implementation experience
Experience supporting production systems and regulatory reporting
Education:
Bachelor's or Master's degree in Computer Science, Engineering, or equivalent experience.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10123373
  • Position Id: SAN7057