Kafka Architect

  • Chicago, IL
  • Posted 5 hours ago | Updated 5 hours ago

Overview

On Site
$50 - $60
Accepts corp to corp applications
Contract - W2
Contract - 12 Month(s)
100% Travel
Able to Provide Sponsorship

Skills

Apache Kafka
Authentication
Authorization
Collaboration
Computer Science
Data Management
Data Quality
Data Warehouse
Database
Encryption
Estimating
FOCUS
High Availability
Management
Mathematics
Microsoft SQL Server
NoSQL
Offshoring
Real-time
Storage
Streaming

Job Details

We are seeking a Senior Confluent Kafka Architect to join our Data Management team. This key role will be responsible for designing, implementing, and managing complex data streaming architectures using Apache Kafka and Confluent technologies. The ideal candidate is a hands-on expert in building enterprise-grade, real-time data pipelines and architectures.

Responsibilities
  • Lead the design and implementation of robust, scalable data-led integration architectures using Confluent Kafka, NoSQL databases, and SQL Server Data Warehouse.

  • Architect and build a ground-up Confluent Kafka platform for the enterprise.

  • Design and develop data streaming pipelines leveraging the Kafka ecosystem.

  • Troubleshoot, diagnose, and resolve Kafka implementation issues; participate in estimation and scoping for Kafka-related projects.

  • Ensure data quality, consistency, and low-latency performance across streaming pipelines.

  • Implement Kafka security best practices, including authentication, authorization, and encryption (a minimal client-configuration sketch follows this list).

  • Collaborate and lead onshore/offshore teams to deliver scalable data streaming solutions.
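
The security bullet above spans both client and broker configuration. As a minimal sketch, the Java snippet below shows a producer configured for TLS encryption in transit and SASL/SCRAM authentication; the bootstrap address, credentials, and topic name are illustrative placeholders, and authorization would additionally require broker-side ACLs for the client's principal.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class SecureProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();

            // Illustrative placeholder for the cluster's TLS listener.
            props.put("bootstrap.servers", "broker1.example.com:9093");

            // Encrypt traffic in transit and authenticate with SASL/SCRAM.
            props.put("security.protocol", "SASL_SSL");
            props.put("sasl.mechanism", "SCRAM-SHA-512");
            props.put("sasl.jaas.config",
                    "org.apache.kafka.common.security.scram.ScramLoginModule required "
                            + "username=\"pipeline-svc\" password=\"change-me\";");

            // String key/value serialization for this example's payloads.
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Authorization is enforced on the broker via ACLs that grant
                // this principal WRITE access to the target topic.
                producer.send(new ProducerRecord<>("orders.events", "order-123", "{\"status\":\"created\"}"));
            }
        }
    }

In a real deployment the JAAS credentials would come from a secrets store rather than being hard-coded.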

Qualifications
  • Bachelor's degree (7+ years of experience) or Master's degree (5+ years of experience) in Computer Science, Engineering, Mathematics, or a related field.

  • 5+ years as a Data Engineer with a focus on Kafka and streaming technologies.

  • 2+ years of hands-on experience with the Confluent Kafka Platform.

  • 3+ years of Kafka architecture experience leading enterprise streaming solutions (offset management, partition strategy, disaster recovery planning).

  • Expertise in Kafka Streams / ksqlDB for real-time stream processing (see the topology sketch after this list).

  • Strong experience with Kafka Connect, designing and building end-to-end streaming data pipelines for real-time data ingestion, processing, and storage.

  • Deep knowledge of Kafka brokers, producers, consumers, topics, and event-driven architectures.

  • Proven ability to ensure data consistency, security, and high availability in enterprise environments.
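
As a rough illustration of the Kafka Streams expertise listed above, the sketch below assembles a small topology that consumes a raw topic, filters and normalizes records, and republishes them to a curated topic. The application id, bootstrap address, and topic names are assumptions for the example, not details from the posting.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class OrderStreamSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-stream-sketch");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1.example.com:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();

            // Consume raw events, drop empty payloads, trim whitespace,
            // and republish to a curated topic for downstream consumers.
            KStream<String, String> raw = builder.stream("orders.raw");
            raw.filter((key, value) -> value != null && !value.isEmpty())
               .mapValues(String::trim)
               .to("orders.curated");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();

            // Close the topology cleanly when the JVM shuts down.
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

An equivalent pipeline could be expressed declaratively in ksqlDB; the Streams API is shown here only because it keeps the example self-contained in Java.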

About GNRSystems