Kafka Engineer

Overview

On Site
$50 - $55
Contract - W2
Contract - 6 Month(s)

Skills

Kafka Engineer
Java
Scala
Python
AWS
Azure

Job Details

Job Title: Kafka Engineer
Location: Toronto, Canada
Job Type: Contract Role
Experience: 8 years
Job Summary:
We are seeking a highly skilled Kafka Engineer to design, build, and maintain real-time data streaming solutions using Apache Kafka. The ideal candidate has experience developing and managing distributed systems and strong expertise in Kafka ecosystem components.
Key Responsibilities:
  • Design, develop, and manage scalable and reliable Kafka-based data pipelines
  • Implement Kafka producers, consumers, and stream processors for real-time and batch data processing
  • Monitor and optimize Kafka performance and troubleshoot issues and bottlenecks
  • Manage Kafka cluster setup, configuration, upgrades, and maintenance
  • Ensure high availability, data consistency, and fault tolerance in Kafka environments
  • Collaborate with data engineers, developers, and DevOps teams to integrate Kafka with various systems
  • Implement data governance, security policies, and access controls for Kafka topics
  • Write automation scripts for operational tasks and deployments
Required Skills:
  • Hands-on experience with Apache Kafka, Kafka Connect, and Kafka Streams
  • Strong knowledge of Kafka internals, message delivery semantics, and partitioning strategies
  • Proficiency in Java, Scala, or Python
  • Experience with Kafka monitoring tools such as Confluent Control Center, Prometheus, and Grafana
  • Familiarity with schema management using Confluent Schema Registry and Avro
  • Understanding of distributed systems, event-driven architecture, and microservices
  • Knowledge of REST APIs and integration with external data sources (e.g., databases, APIs, file systems)
Nice to Have:
  • Experience with Confluent Platform or AWS MSK
  • Knowledge of Docker, Kubernetes, and CI/CD pipelines
  • Experience with cloud platforms like AWS, Azure, or Google Cloud Platform
  • Understanding of data streaming platforms like Apache Flink or Apache Spark
  • Exposure to DevOps practices and Infrastructure as Code (IaC)