Kafka Principal Data Engineer, Streaming & Architecture (Remote)

Overview

Remote
$80 - $90
Contract - Independent
Contract - W2
Contract - 12 Month(s)
No Travel Required

Skills

Apache Kafka
Change Data Capture
Data Validation
Extract, Transform, Load (ETL)
IT Management
Informatica MDM

Job Details

Kafka Principal Data Engineer, Streaming & Architecture

Location: Remote

Duration: 12 Months+

We are seeking a highly skilled Data Engineer and Technical Lead to spearhead the transformation of legacy batch data pipelines into a modern, event-driven architecture using Apache Kafka. This role will lead the design and implementation of streaming data pipelines and ensure seamless integration with Informatica MDM.

Key Responsibilities:

- Design and deliver scalable data architecture for streaming ingestion.

- Convert batch file loads from claim systems to Kafka-based streams.

- Implement change data capture (CDC) using Debezium or similar tools.

- Ensure data validation through dual-run testing and retire legacy batch loads.

- Collaborate with client data engineering teams via pair programming and knowledge transfer.
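To illustrate the dual-run validation responsibility above, the sketch below compares the output of a legacy batch load against the output of the replacement stream, keyed by record ID. All names (`dual_run_diff`, `claim_id`, the sample records) are hypothetical and purely illustrative; a real pipeline would read from files and Kafka topics rather than in-memory lists.

```python
# Hypothetical sketch of dual-run validation: run the legacy batch load and
# the new Kafka-based stream side by side, then diff their outputs by key.
# Names and sample data are illustrative, not taken from any real system.

def dual_run_diff(batch_records, stream_records, key="claim_id"):
    """Return (missing_in_stream, extra_in_stream, mismatched) key lists."""
    batch = {r[key]: r for r in batch_records}
    stream = {r[key]: r for r in stream_records}
    missing = sorted(set(batch) - set(stream))        # in batch only
    extra = sorted(set(stream) - set(batch))          # in stream only
    mismatched = sorted(                              # same key, different payload
        k for k in set(batch) & set(stream) if batch[k] != stream[k]
    )
    return missing, extra, mismatched

# Example: the stream dropped nothing, added C3, and disagrees on C2.
batch_out = [{"claim_id": "C1", "amount": 100}, {"claim_id": "C2", "amount": 250}]
stream_out = [{"claim_id": "C1", "amount": 100}, {"claim_id": "C2", "amount": 200},
              {"claim_id": "C3", "amount": 75}]

missing, extra, mismatched = dual_run_diff(batch_out, stream_out)
print(missing, extra, mismatched)  # [] ['C3'] ['C2']
```

Once a dual run reports empty diffs over an agreed validation window, the legacy batch load can be retired with confidence.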

Required Skills:

- Strong experience with Apache Kafka and CDC tools.

- Proficiency in data pipeline development, data engineering, data architecture, and data validation strategies.

- Agile delivery experience and excellent communication skills.

- (Nice to have) Familiarity with Informatica MDM integration.

- (Nice to have) Project management experience.
