Job Details
Job Title: Lead Confluent/Kafka Engineer (New York, NY)
Duration: 12+ Months
Experience: 10+ Years
Job Description
- New architecture pattern for data consumption built around Kafka.
- Must have hands-on experience developing against Kafka topics and clusters (cluster administration experience is not required).
- 5-7+ years of experience building pipelines and working with object stores using Kafka products.
- Previous candidates lacked dashboard/UI experience and could not explain how to implement solutions using connectors.
- Needs someone who understands the data platform: self-managed and fully managed connectors, and ETL using those pipelines.
- A Glider assessment is in the works and will be provided once created.
- Previous Kafka candidates lacked the following required experience:
Real-world experience setting up Kafka real-time data streaming.
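The connector and pipeline experience the notes above call for might look like the following minimal sketch of a self-managed Kafka Connect source connector configuration; the connector name, database URL, table, and credentials are hypothetical placeholders, while the property names follow the Confluent JDBC source connector's documented settings:

```properties
# Hypothetical self-managed JDBC source connector (standalone worker style).
# Streams rows from a relational table into a Kafka topic for downstream ETL.
name=orders-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:postgresql://db.example.com:5432/orders
connection.user=etl_user
connection.password=********
# Incremental-load mode: only rows with a higher order_id are fetched each poll.
mode=incrementing
incrementing.column.name=order_id
table.whitelist=orders
# Resulting topic name: pg.orders
topic.prefix=pg.
```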
Responsibilities and Duties:
- Architect and design scalable, fault-tolerant, and high-performance Kafka-based data streaming solutions. Lead technical design sessions and provide architectural guidance to junior engineers.
- Develop and maintain Confluent Platform components, including Kafka Connect, Kafka Streams, Flink, TableFlow, and ksqlDB. Implement and manage monitoring and alerting systems for the Kafka cluster.
- Lead troubleshooting and resolution of complex issues within the data streaming platform. Perform root cause analysis and implement corrective actions to prevent recurrence. Mentor and guide junior engineers.
- Collaborate with other engineering teams to integrate data streams into their applications. Develop and maintain comprehensive documentation for the data streaming platform. Provide technical leadership and guidance on best practices.
- Participate in code reviews and ensure adherence to coding standards and best practices.
- Proactively identify and address potential performance bottlenecks and scalability issues.
- Contribute to continuous improvement initiatives.
- Stay up to date on the latest Confluent Platform and Kafka technologies.
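As one concrete illustration of the ksqlDB work named in the responsibilities above, a minimal sketch: declare a stream over an existing Kafka topic, then derive a filtered stream from it. Topic, stream, and column names are hypothetical placeholders.

```sql
-- Declare a stream over an existing topic (hypothetical names throughout).
CREATE STREAM orders_raw (
  order_id VARCHAR KEY,
  customer_id VARCHAR,
  amount DOUBLE
) WITH (
  KAFKA_TOPIC = 'pg.orders',
  VALUE_FORMAT = 'AVRO'
);

-- Persistent query: continuously materialize large orders to a new stream.
CREATE STREAM large_orders AS
  SELECT order_id, customer_id, amount
  FROM orders_raw
  WHERE amount > 1000
  EMIT CHANGES;
```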
Required Qualifications:
1. Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred.
2. 5+ years of experience with Confluent Platform and Kafka, including experience in designing and implementing large-scale data streaming solutions.
3. Proficient in Java or other JVM languages (e.g., Scala, Kotlin). Experience with Kafka Connect, Kafka Streams, and ksqlDB. Strong understanding of Kafka architecture and concepts (topics, partitions, consumers, producers). Experience with message queuing systems. Familiarity with cloud-based environments (AWS, Azure, Google Cloud Platform). Excellent problem-solving and debugging skills. Experience with CI/CD.
4. Experience leading and mentoring engineering teams. Strong architectural skills. Experience with performance tuning and optimization. Experience with schema registries and serialization formats (e.g., Confluent Schema Registry, Avro).
5. Confluent certification (e.g., Confluent Certified Developer for Apache Kafka).
6. Ability to work independently and as part of a team. Excellent communication and collaboration skills.
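The schema-registry experience required above generally means defining and evolving serialization schemas. A minimal sketch of an Avro record schema of the kind registered with a schema registry (record and field names are hypothetical):

```json
{
  "type": "record",
  "name": "Order",
  "namespace": "com.example.orders",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "customer_id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
```

With a registry enforcing compatibility rules, new fields would typically be added with defaults so existing consumers keep working.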
Preferred Qualifications:
1. Experience with other streaming technologies (e.g., Spark Streaming, Flink).
2. Experience with containerization technologies (e.g., Docker, Kubernetes).
3. Experience with data visualization and analytics tools.