Overview
On Site
$70 - $80
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Skills
Amazon Web Services
Apache Flink
Apache HTTP Server
Apache Hadoop
Apache Kafka
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Data Engineering
Data Integration
Data Processing
Databricks
DevOps
Disaster Recovery
Financial Services
Google Cloud Platform
Grafana
High Availability
IBM WebSphere MQ
Management
Messaging
Microsoft Azure
Migration
Python
RBAC
Real-time
Scripting
Shell
Snowflake
Streaming
TIBCO Software
Terraform
Job Details
Position: Kafka Architect
Location: McLean, VA
Duration: Long Term
Role Overview
The Kafka Architect will be responsible for designing, implementing, and managing scalable, high-performance data streaming solutions using the Apache Kafka ecosystem, with a strong focus on Confluent Platform and Confluent Cloud. The role demands deep expertise in real-time data processing, event-driven architecture, and integration with modern cloud and data platforms.
Key Responsibilities
- Architect and implement enterprise-grade Kafka solutions using Confluent Platform/Cloud.
- Design and manage Kafka clusters, brokers, topics, and partitions for optimal performance and reliability.
- Lead migration efforts from legacy messaging systems (e.g., IBM MQ, TIBCO) to Kafka.
- Develop and optimize Kafka Streams, ksqlDB, Kafka Connect, and Flink-based pipelines (a simplified pipeline sketch follows this list).
- Ensure high availability, disaster recovery, and security (RBAC, ACLs) of Kafka infrastructure.
- Collaborate with data engineering, DevOps, and application teams to integrate Kafka into broader enterprise systems.
- Monitor and troubleshoot Kafka environments using tools like Grafana, Prometheus, and Confluent Control Center.
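For illustration only, the sketch below shows the kind of real-time consume-transform-produce flow these responsibilities cover. It is a minimal example assuming the confluent-kafka Python client; the broker address, topic names (orders.raw, orders.enriched), and enrichment step are placeholders, and production pipelines in this role would typically use Kafka Streams, ksqlDB, Kafka Connect, or Flink as noted above.

# Minimal consume-transform-produce loop with the confluent-kafka Python client.
# Broker address and topic names are illustrative placeholders, not from this posting.
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder; point at the real cluster
    "group.id": "orders-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["orders.raw"])           # hypothetical source topic

try:
    while True:
        msg = consumer.poll(1.0)             # wait up to 1 second for a record
        if msg is None or msg.error():
            continue
        enriched = msg.value().upper()       # stand-in for real enrichment logic
        producer.produce("orders.enriched", key=msg.key(), value=enriched)
        producer.poll(0)                     # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()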
Required Skills
- 8-10+ years of experience in Kafka architecture and implementation.
- Deep expertise in:
  - Apache Kafka
  - Confluent Platform / Confluent Cloud
  - Kafka Connect, Kafka Streams, ksqlDB
  - ZooKeeper/KRaft
- Experience with cloud platforms (AWS, Google Cloud Platform, Azure) and CI/CD pipelines.
- Strong understanding of data integration tools (e.g., Snowflake, Databricks, Hadoop).
- Familiarity with scripting (Shell, Python) and infrastructure-as-code (Terraform); a brief Python scripting sketch follows this list.
- Financial services or credit union domain experience is highly preferred.
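As a hedged example of the Python scripting this role calls for, the snippet below creates a topic with explicit partition and replication settings using the confluent-kafka AdminClient. The topic name, partition count, and replication factor are illustrative assumptions, not requirements stated in this posting.

# Create a topic with explicit partition and replication settings via AdminClient.
# All values shown are placeholders for illustration.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # placeholder brokers

topic = NewTopic("payments.events", num_partitions=6, replication_factor=3)

# create_topics() returns a dict of topic name -> future; wait on each result.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()                      # raises on failure
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")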
Certifications (Preferred)
- Confluent Certified Developer for Apache Kafka (CCDAK)
- Confluent Certified Administrator for Apache Kafka
- AWS / Google Cloud Platform certifications
- ITIL Foundation