Overview
Accepts corp-to-corp applications
Contract - 29 days
Skills
Python
AWS
Java
Kubernetes
Terraform
ELK
Prometheus
Grafana
Helm
Apache Kafka
Brokers
Schema Registry
Control Center
ksqlDB
EKS
Microservices integration
Job Details
Job Summary:
We are seeking a Senior Kafka Engineer to manage, enhance, and scale an enterprise-grade Apache Kafka implementation deployed on AWS using the Confluent Platform. This person will be responsible for keeping the system reliable, improving it over time, and expanding it to support new applications.
This role involves performing detailed architectural reviews, monitoring, performance tuning, optimizing
existing Kafka pipelines, and partnering with application teams to deliver reliable, secure, and performant
streaming solutions in a FinTech environment.
Role and Responsibilities:
- Manage and enhance existing Apache Kafka and Confluent Platform on AWS.
- Review existing implementations and recommend improvements.
- Collaborate with engineering and product teams to integrate new use cases and define scalable streaming patterns.
- Implement and maintain Kafka producers/consumers, connectors, and Kafka Streams applications.
- Enforce governance around topic design, schema evolution, partitioning, and data retention.
- Monitor, troubleshoot, and tune Kafka clusters using Confluent Control Center, Prometheus, and Grafana.
- Use Kubernetes and Terraform to automate Kafka infrastructure deployment and scaling.
- Ensure high availability, security, and disaster recovery.
- Mentor other engineers and provide leadership in Kafka-related initiatives.
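To illustrate the governance responsibility above (topic design, partitioning, and data retention), a pre-deployment check might look like the following minimal Python sketch. The naming convention, partition bounds, and retention cap here are illustrative assumptions for the sketch, not the employer's actual policy:

```python
import re

# Illustrative governance rules -- the naming pattern, partition bounds, and
# retention cap are assumed values, not a real policy.
TOPIC_NAME_PATTERN = re.compile(r"^[a-z0-9-]+\.[a-z0-9-]+\.[a-z0-9-]+$")  # e.g. payments.orders.created
MAX_PARTITIONS = 48
MAX_RETENTION_MS = 7 * 24 * 60 * 60 * 1000  # 7-day retention cap (illustrative)

def validate_topic(name: str, partitions: int, retention_ms: int) -> list[str]:
    """Return a list of governance violations for a proposed topic (empty list = OK)."""
    violations = []
    if not TOPIC_NAME_PATTERN.match(name):
        violations.append(f"name '{name}' does not match <domain>.<entity>.<event> convention")
    if not 1 <= partitions <= MAX_PARTITIONS:
        violations.append(f"partition count {partitions} outside allowed range 1-{MAX_PARTITIONS}")
    if retention_ms > MAX_RETENTION_MS:
        violations.append(f"retention {retention_ms} ms exceeds {MAX_RETENTION_MS} ms cap")
    return violations
```

In practice a check like this would run in CI or in a self-service topic-provisioning pipeline before any topic is created on the cluster.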
Required Skills:
- 8+ years in platform engineering with 3+ years of hands-on experience with Apache Kafka.
- Proficiency in Kafka client development using Java or Python.
- Expertise with Confluent Platform (Brokers, Schema Registry, Control Center, ksqlDB).
- Experience deploying and managing Kafka on AWS (including MSK or self-managed EC2-based setups).
- Solid understanding of Kubernetes, especially EKS, for microservices integration.
- Hands-on experience with Kafka Connect, Kafka Streams, and schema management.
- Infrastructure automation experience with Terraform and Helm.
- Familiarity with monitoring and alerting stacks: Prometheus, Grafana, ELK, or similar.
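Kafka monitoring with stacks like Prometheus and Grafana typically centers on consumer lag: the gap between a partition's log-end offset and the consumer group's committed offset. A minimal sketch of that calculation, using hypothetical offset maps rather than live cluster data:

```python
def consumer_lag(end_offsets: dict[int, int], committed: dict[int, int]) -> dict[int, int]:
    """Per-partition lag = log-end offset minus the group's committed offset.

    A partition with no committed offset yet is treated as fully lagged
    (i.e. the group would start from offset 0).
    """
    return {p: end - committed.get(p, 0) for p, end in end_offsets.items()}
```

Values like these are what an exporter would publish as a gauge metric for Grafana dashboards and alerting rules.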
Nice to Have:
- Prior experience in the FinTech domain or other regulated industries.
- Understanding of security best practices including TLS, authentication (SASL, OAuth), RBAC, and encryption at rest.
- Exposure to Apache Flink, Spark Streaming, or other stream processing engines.
- Experience establishing Kafka governance frameworks and multi-tenant topic strategies.