Job Details
We are seeking a Senior Confluent Kafka Architect to join our Data Management team. This key role will be responsible for designing, implementing, and managing complex data streaming architectures using Apache Kafka and Confluent technologies. The ideal candidate is a hands-on expert in building enterprise-grade, real-time data pipelines and architectures.
Responsibilities
Lead the design and implementation of robust, scalable, data-led integration architectures using Confluent Kafka, NoSQL databases, and SQL Server Data Warehouse.
Architect and build a ground-up Confluent Kafka platform for the enterprise.
Design and develop data streaming pipelines leveraging the Kafka ecosystem.
Troubleshoot, diagnose, and resolve Kafka implementation issues; participate in estimation and scoping for Kafka-related projects.
Ensure data quality, consistency, and low-latency performance across streaming pipelines.
Implement Kafka security best practices, including authentication, authorization, and encryption.
Collaborate with and lead onshore/offshore teams to deliver scalable data streaming solutions.
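To make the security expectations above concrete, here is a minimal sketch of a Kafka client configuration covering the practices named (authentication, encryption in transit, durable delivery). Property names follow the librdkafka / confluent-kafka-python convention; the broker address and credentials are placeholders, not values from this posting.

```python
def secured_client_config(bootstrap: str, api_key: str, api_secret: str) -> dict:
    """Build a Kafka client property map using SASL/PLAIN over TLS,
    a scheme commonly used with Confluent-managed clusters.
    All endpoint and credential values here are illustrative."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",   # TLS encryption in transit
        "sasl.mechanism": "PLAIN",         # authentication mechanism
        "sasl.username": api_key,
        "sasl.password": api_secret,
        "acks": "all",                     # wait for all in-sync replicas
        "enable.idempotence": True,        # avoid duplicate writes on retry
    }

config = secured_client_config("broker.example.com:9092", "API_KEY", "API_SECRET")
```

Authorization (which principal may read or write which topic) is enforced broker-side via ACLs or role bindings, so it does not appear in the client property map.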
Qualifications
Bachelor's degree (7+ years of experience) or Master's degree (5+ years of experience) in Computer Science, Engineering, Mathematics, or a related field.
5+ years as a Data Engineer with a focus on Kafka and streaming technologies.
2+ years of hands-on experience with the Confluent Kafka platform.
3+ years of Kafka architecture experience, leading enterprise streaming solutions (offset management, partition strategy, DR planning).
Expertise in Kafka Streams and ksqlDB for real-time stream processing.
Strong experience with Kafka Connect, including designing and building end-to-end streaming pipelines for real-time data ingestion, processing, and storage.
Deep knowledge of Kafka brokers, producers, consumers, topics, and event-driven architectures.
Proven ability to ensure data consistency, security, and high availability in enterprise environments.
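The partition-strategy requirement above can be illustrated with a dependency-free sketch of key-based partition assignment. Kafka's default partitioner hashes the record key with murmur2; CRC-32 stands in here only to keep the example self-contained and deterministic, so the exact partition numbers will differ from a real cluster.

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Key-based partition assignment: the same key always lands on the
    same partition, which is what preserves per-key ordering in Kafka.
    Simplified stand-in for Kafka's murmur2-based default partitioner."""
    return zlib.crc32(key) % num_partitions

# The same key is always routed to the same partition.
p1 = assign_partition(b"customer-42", 12)
p2 = assign_partition(b"customer-42", 12)
assert p1 == p2
```

This is also why growing a topic's partition count is an architectural decision rather than a tuning knob: changing `num_partitions` re-shuffles keys across partitions and breaks per-key ordering for in-flight data.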