Overview
Hybrid
$60 per hour
Accepts corp-to-corp applications
Contract - W2
Contract - Independent
Skills
Docker
Kubernetes
Kafka
Apache Kafka
Kafka Streams
Job Details
Job Title: Kafka Developer
Location: Charlotte, NC (Hybrid 3 days onsite per week)
Duration: Long-Term Contract
Job Summary
Our client is seeking a highly skilled Kafka Developer to join its Enterprise Data & Messaging team. The role focuses on designing, building, and maintaining real-time streaming applications built on Apache Kafka. The ideal candidate has strong experience with Java, Spring Boot, microservices, and event-driven architecture, along with a proven ability to deliver scalable, resilient, and high-performance data pipelines.
Key Responsibilities
- Design, develop, and maintain Kafka producers, consumers, streams, and connectors.
- Implement real-time data pipelines and event-driven microservices using Kafka and Spring Boot.
- Collaborate with architects and engineers to integrate Kafka solutions with core banking and enterprise applications.
- Manage and optimize Kafka clusters, ensuring scalability, high availability, and fault tolerance.
- Monitor and troubleshoot topics, partitions, consumer groups, and offsets to maintain system health.
- Apply best practices in security, compliance, and governance in BFSI environments.
- Work in Agile/Scrum teams, participate in sprint planning, and ensure CI/CD integration with GitHub, Jenkins, and Harness.
Required Skills
- 8-10 years of software development experience, with a strong focus on Java (Spring Boot, microservices).
- 4+ years of hands-on Kafka experience (Kafka Streams, Connect, Schema Registry, Confluent or MSK). Strong expertise in event-driven architecture and messaging patterns.
- Solid knowledge of SQL/NoSQL databases and integration with streaming pipelines.
- Experience deploying on cloud platforms (AWS, Azure, or Google Cloud Platform) and container orchestration (Docker, Kubernetes). Familiarity with CI/CD tools (GitHub, Jenkins, Harness, UrbanCode).
- Excellent problem-solving and communication skills. Banking/Financial Services domain experience preferred.
Nice to Have
- Exposure to Spark, Hadoop, or other big data ecosystems.
- Experience with monitoring tools (Datadog, Splunk, Prometheus).
- Kafka certification (Confluent or equivalent) is a plus.