Overview
Remote
Depends on Experience
Accepts corp-to-corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
Skills
Kafka Engineer
Job Details
Position: Senior Kafka Engineer (10+ Years' Experience)
Location: Remote
Duration: 1+ Year (Extendable)
Job Description:
We are seeking a highly experienced Kafka Engineer with over 10 years of hands-on experience in building, maintaining, and optimizing distributed data systems using Apache Kafka. The ideal candidate will play a critical role in designing and implementing scalable, real-time streaming solutions that power our data infrastructure.
You will be responsible for developing robust data pipelines and streaming platforms, ensuring high availability, performance, and reliability across our data-driven systems.
Key Responsibilities:
- Architect, design, and implement real-time data pipelines using Apache Kafka, Kafka Streams, and Kafka Connect.
- Develop and maintain Kafka producers and consumers using Java or Python.
- Work with schema management tools such as Apache Avro or Protobuf.
- Integrate Kafka with various data sources and sinks using Kafka Connect.
- Optimize the performance and scalability of Kafka clusters in production environments.
- Monitor, tune, and troubleshoot issues using tools like Prometheus, Grafana, Confluent Control Center, or other monitoring solutions.
- Ensure data reliability, fault tolerance, and consistency across distributed systems.
- Work closely with DevOps, Data Engineering, and Application teams to ensure seamless Kafka integration across the organization.
- Maintain and document Kafka ecosystems, configurations, best practices, and operational guides.
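For illustration of the Kafka Connect integration work described above, a minimal source-connector configuration might look like the sketch below. All names, hosts, and credentials are placeholders, and the property set assumed here is the Confluent JDBC source connector; the actual connectors and settings would depend on the data sources in use.

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:postgresql://db-host:5432/orders",
    "connection.user": "connect_user",
    "connection.password": "changeme",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "orders-"
  }
}
```

A configuration like this would typically be submitted to the Kafka Connect REST API, which then streams new rows from the source table into Kafka topics without custom producer code.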
Required Skills:
- 10+ years of experience in software engineering with a strong focus on distributed systems and real-time data streaming.
- Deep expertise in Apache Kafka internals, Kafka Streams, Kafka Connect, and Kafka security.
- Proficient in Java or Python for Kafka application development.
- Experience with Avro/Protobuf for schema definition and evolution.
- Proven experience in building scalable, fault-tolerant streaming data pipelines.
- Strong debugging, performance tuning, and problem-solving skills.
- Experience with CI/CD pipelines and version control systems like Git.
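As a sketch of the Avro schema definition and evolution skills listed above, the record below (an invented example, not from this posting) shows the common backward-compatible pattern: a newly added field carries a default value, so consumers using the new schema can still read records written with the old one.

```json
{
  "type": "record",
  "name": "OrderEvent",
  "namespace": "com.example.events",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "currency", "type": "string", "default": "USD"}
  ]
}
```

Here `currency` is the evolved field; because it has a default, removing it or adding it later does not break schema resolution for existing readers.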
Preferred Qualifications:
- Experience with cloud platforms such as AWS, Google Cloud Platform (GCP), or Azure.
- Hands-on experience with containerization (Docker) and orchestration tools (Kubernetes).
- Familiarity with Confluent Platform and enterprise Kafka solutions.
- Prior experience working in agile teams and participating in code reviews and sprint planning.
Education:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.