Data Migration & Data Feeds Specialist

Overview

Remote
Contract - W2

Skills

Databricks
Azure Data Factory
JSON
Data Quality
Data Mining
Data Cleansing
Big Data
AWS Glue
Data Migration
Problem Solving
Data Integration
Data Pipelines
Systems Integration
Snowflake
Data Structures
Medical Records
Test Planning
Data Validation
SQL Server Integration Services
SQL Stored Procedures
Application Programming Interfaces (APIs)
Self Motivation
Simple Object Access Protocol (SOAP)
Communication Skills
Extract Transform Load (ETL)
SQL Databases
Attention to Detail
Airflow
Data Modeling
Analytical Thinking
Acceptance Testing
Quality Management
Extensible Markup Language (XML)
Relational Databases
Fast Healthcare Interoperability Resources (FHIR)
Health Level Seven International (HL7)
Data Transmissions
Data Streaming
Electronic Medical Records
Real Time Data
Insurance Claim Processing
Clinical Works
Document-Oriented Databases
File Transfer Protocol (FTP)
Informatica PowerCenter
Quality Auditing
Talend
Test Data

Job Details

Job Title: Data Migration & Data Feeds Specialist

Job Summary

We are seeking a skilled Data Migration and Data Feeds Specialist with strong experience in handling large datasets, building data pipelines, and ensuring seamless data integration across systems. The ideal candidate will have hands-on experience with ETL processes, API-based data feeds, file-based integrations, and data quality frameworks. Prior experience in healthcare data (EHR/EMR, claims, clinical, HL7/FHIR) is highly preferred.

Key Responsibilities
  • Lead and execute end-to-end data migration activities including data extraction, transformation, validation, cleansing, and loading into target systems.

  • Analyze legacy data structures and map to new system schemas to ensure accurate and complete data transfer.

  • Develop, maintain, and troubleshoot automated and real-time data feeds (API, SFTP, flat files, JSON, XML, HL7, FHIR, etc.).

  • Build and optimize ETL/ELT pipelines using industry-standard tools and cloud platforms.

  • Perform detailed data validation, reconciliation, and quality checks before and after migration.

  • Work with business and technical teams to understand data requirements, transformation logic, and integration needs.

  • Monitor daily/weekly data loads and proactively resolve feed failures or data discrepancies.

  • Document data mappings, migration rules, feed specifications, and error-handling procedures.

  • Collaborate with QA teams for test planning, test data preparation, and UAT support.

  • Support production go-live, data cutover planning, and post-migration audits.

Required Skills & Experience
  • 5-10+ years of hands-on experience in data migration, data integration, and data feed development.

  • Strong expertise in SQL (advanced queries, joins, stored procedures), data modeling, and relational databases.

  • Experience with ETL tools and technologies such as Informatica, Talend, SSIS, Snowflake, Databricks, Airflow, AWS Glue, Azure Data Factory, etc.

  • Experience working with API-based integrations (REST, SOAP) and file-based data feeds (CSV, XML, JSON).

  • Strong understanding of data quality, data cleansing, and reconciliation techniques.

  • Ability to understand and map complex datasets across multiple systems.

  • Excellent analytical skills and attention to detail.

  • Strong documentation and communication skills.