Kafka Principal Data Engineer, Streaming & Architecture
Location: Remote
Duration: 12+ months
We are seeking a highly skilled Data Engineer and Technical Lead to spearhead the transformation of legacy batch data pipelines into a modern, event-driven architecture using Apache Kafka. This role will lead the design and implementation of streaming data pipelines and ensure seamless integration with Informatica MDM.
Key Responsibilities:
- Design and deliver scalable data architecture for streaming ingestion.
- Convert batch file loads from claim systems to Kafka-based streams.
- Implement change data capture (CDC) using Debezium or similar tools.
- Ensure data validation through dual-run testing and retire legacy batch loads.
- Collaborate with client data engineering teams via paired programming and knowledge transfer.
Required Skills:
- Strong experience with Apache Kafka and CDC tools.
- Proficiency in data pipeline development, data engineering, data architecture, and data validation strategies.
- Agile delivery experience and excellent communication skills.
- (Nice to have) Familiarity with Informatica MDM integration.
- (Nice to have) Project management experience.