W2 - Kafka Developer

  • Dallas, TX
  • Posted 1 day ago | Updated 1 day ago

Overview

On Site
Depends on Experience
Contract - W2
Contract - 6 Month(s)
No Travel Required
Unable to Provide Sponsorship

Skills

Agile
Amazon Web Services
Java
Linux
DevOps

Job Details

Job Title: Kafka Developer
Location: Dallas, TX (Onsite)

Job Description

We are looking for a Kafka Developer to design, develop, and support event-driven and real-time data streaming solutions using Apache Kafka. The ideal candidate will work closely with application teams to build scalable, reliable messaging pipelines and integrate Kafka with microservices and distributed systems.

Key Responsibilities

  • Design and develop Kafka producers and consumers
  • Build and maintain event-driven architectures
  • Develop data pipelines using Kafka topics, partitions, and consumer groups
  • Integrate Kafka with microservices and backend applications
  • Monitor and troubleshoot Kafka-related issues
  • Optimize Kafka performance, throughput, and reliability
  • Work with DevOps teams to deploy Kafka solutions in cloud or on-prem environments
  • Support testing, debugging, and production issue resolution

Required Skills

  • 3+ years of experience with Apache Kafka
  • Strong understanding of Kafka architecture (topics, partitions, brokers, replication)
  • Experience developing Kafka producers and consumers
  • Knowledge of event-driven and messaging patterns
  • Experience with microservices architecture
  • Familiarity with Java or Node.js (or other backend languages)
  • Understanding of offset management and fault tolerance
  • Basic knowledge of Kafka security (SSL, SASL – nice to have)
  • Experience with Linux environments

Nice to Have Skills

  • Kafka Connect and Kafka Streams
  • Schema Registry (Avro/JSON/Protobuf)
  • Cloud platforms (AWS / Azure / Google Cloud Platform)
  • Docker and Kubernetes
  • Monitoring tools (Prometheus, Grafana)

Soft Skills

  • Strong problem-solving skills
  • Good communication and teamwork
  • Ability to work independently and in Agile environments