Kafka Architect - McLean, VA

Overview

On Site
$65 - $70
Contract - W2
Contract - 6 Month(s)
No Travel Required

Skills

Kafka
Salesforce Community Cloud
Application Integration Architecture
Apache Kafka
Confluent Platform
Confluent Cloud
Kafka Connect
ksqlDB
ZooKeeper
KRaft
Kafka Streams
AWS
GCP
Azure
CI/CD Pipelines
Snowflake
Databricks
Hadoop
Shell
Python
Terraform
RBAC
ACLs
disaster recovery
security
DevOps
IBM MQ
TIBCO

Job Details

Note: No C2C for this one, as the client will be paying the candidate directly.
The client wants candidates local to McLean, VA.

MUST HAVE:

10+ years of experience in Kafka architecture and implementation.
Deep expertise in Apache Kafka, Confluent Platform, Confluent Cloud, Kafka Connect, Kafka Streams, ksqlDB, ZooKeeper, and KRaft.
Experience with cloud platforms (AWS, Google Cloud Platform, Azure) and CI/CD pipelines.
Strong understanding of data integration tools (e.g., Snowflake, Databricks, Hadoop).
Familiarity with scripting (Shell, Python) and infrastructure-as-code (Terraform).

Job Summary

The Kafka Architect will be responsible for designing, implementing, and managing scalable, high-performance data streaming solutions using the Apache Kafka ecosystem, with a strong focus on Confluent Platform and Confluent Cloud.
The role demands deep expertise in real-time data processing, event-driven architecture, and integration with modern cloud and data platforms.
Financial services or credit union domain experience is highly preferred.

Responsibilities

Architect and implement enterprise-grade Kafka solutions using Confluent Platform and Confluent Cloud.
Design and manage Kafka clusters, brokers, topics, and partitions for optimal performance and reliability.
Lead migration efforts from legacy messaging systems (e.g., IBM MQ, TIBCO) to Kafka.
Develop and optimize Kafka Streams, ksqlDB, Kafka Connect, and Flink-based pipelines.
Ensure high availability, disaster recovery, and security (RBAC, ACLs) of Kafka infrastructure.
Collaborate with data engineering, DevOps, and application teams to integrate Kafka into broader enterprise systems.
Monitor and troubleshoot Kafka environments using tools like Grafana, Prometheus, and Confluent Control Center.


About SmartTech Staffing Partners