Overview
Skills
Job Details
Job Title
Confluent Kafka Streaming Engineer - Terraform | SQL CDC | Snowflake - Santand
Job Description
We are looking for a hands-on Kafka Streaming Engineer with expertise in Confluent Kafka, Terraform, and DevOps pipelines to design, implement, and maintain real-time streaming solutions. The ideal candidate will help stream data from SQL Server (using CDC) to Snowflake, working closely with DevOps and infrastructure teams. The role offers the flexibility to work onshore and requires strong experience with TFS (Azure DevOps) and infrastructure as code (IaC) using Terraform.
Key Responsibilities
Build and support real-time data pipelines using Confluent Kafka and Kafka Connect.
Configure SQL Server CDC and stream change data to Kafka (see the CDC sketch after this list).
Design and deploy Kafka source and sink connectors, especially for SQL Server and Snowflake (see the connector sketch after this list).
Use Terraform to provision and manage Kafka infrastructure including:
Topics
Connectors
ACLs
Schema Registry configurations
Automate deployments and topic creation using Azure DevOps (TFS) pipelines.
Collaborate with DevOps and data engineering teams to ensure end-to-end data delivery.
Monitor pipeline performance and troubleshoot issues as needed.
Document architecture, configuration, and deployment workflows.
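As a concrete illustration of the CDC responsibility above, here is a minimal, hypothetical sketch of enabling SQL Server CDC on a database and one table using Python and pyodbc; the connection string, schema, and table names are placeholders and would differ in a real environment.

```python
# Hypothetical sketch: enable SQL Server CDC on a database and one table.
# Connection details, schema, and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.internal;DATABASE=orders_db;"
    "UID=cdc_admin;PWD=<secret>;TrustServerCertificate=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# Database-level CDC (requires sysadmin membership).
cursor.execute("EXEC sys.sp_cdc_enable_db")

# Table-level CDC (requires db_owner); the resulting capture instance is what
# a Kafka source connector such as Debezium reads change events from.
cursor.execute(
    """
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'orders',
        @role_name     = NULL
    """
)
conn.close()
```

Once CDC is enabled, the change tables feed the source connector configured in the next sketch.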
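For the connector deployment responsibility, here is a rough sketch of registering a Debezium SQL Server source and a Snowflake sink through the Kafka Connect REST API. The Connect URL, credentials, and several config keys are illustrative and vary by connector version (schema-history and converter settings are omitted), so treat this as a starting point rather than a reference configuration.

```python
# Hypothetical sketch: register connectors via the Kafka Connect REST API.
# The Connect URL, credentials, and many config keys are illustrative and
# version-dependent; check the Debezium and Snowflake connector docs.
import requests

CONNECT_URL = "http://connect.example.internal:8083/connectors"

sqlserver_source = {
    "name": "orders-sqlserver-source",
    "config": {
        "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
        "database.hostname": "sqlserver.example.internal",
        "database.port": "1433",
        "database.user": "cdc_reader",
        "database.password": "<secret>",
        "table.include.list": "dbo.orders",
        # Topic naming and schema-history settings differ across Debezium versions.
        "topic.prefix": "orders_db",
    },
}

snowflake_sink = {
    "name": "orders-snowflake-sink",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "orders_db.dbo.orders",
        "snowflake.url.name": "myaccount.snowflakecomputing.com",
        "snowflake.user.name": "KAFKA_LOADER",
        "snowflake.private.key": "<private-key>",
        "snowflake.database.name": "RAW",
        "snowflake.schema.name": "KAFKA",
    },
}

for connector in (sqlserver_source, snowflake_sink):
    resp = requests.post(CONNECT_URL, json=connector, timeout=30)
    resp.raise_for_status()
    print(f"Created {connector['name']}: {resp.status_code}")
```

In this role such requests would more likely be issued from an Azure DevOps (TFS) pipeline, or the connectors managed declaratively with Terraform, rather than run by hand.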
Required Skills & Qualifications
Strong hands-on experience with Apache Kafka / Confluent Kafka ecosystem.
Expertise in Kafka Connect, Kafka Streams, or ksqlDB.
Practical experience in streaming data from SQL Server using CDC.
Experience integrating Kafka with Snowflake using connectors or Snowpipe Streaming.
Proficiency in Terraform to manage Kafka infrastructure.
Working knowledge of CI/CD pipelines using Azure DevOps (TFS).
Familiarity with Confluent Schema Registry and serialization formats (Avro, JSON); see the sketch after this list.
Good scripting/programming experience with Python, Java, or Scala.
Experience in cross-team collaboration (Data/DevOps/Cloud).
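To make the Schema Registry and Avro expectations concrete, below is a minimal sketch using the confluent-kafka Python client; the broker and registry URLs, topic name, and schema are placeholders, and the clients assume an unauthenticated test setup.

```python
# Minimal sketch: produce Avro records registered against Confluent Schema Registry.
# Broker/registry URLs, topic, and schema are placeholders for illustration.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

ORDER_SCHEMA = """
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "long"},
    {"name": "amount", "type": "double"}
  ]
}
"""

schema_registry = SchemaRegistryClient({"url": "http://schema-registry.example.internal:8081"})
avro_serializer = AvroSerializer(schema_registry, ORDER_SCHEMA)

producer = Producer({"bootstrap.servers": "broker.example.internal:9092"})

# Serialize the record value against the registered schema, then produce it.
value = avro_serializer(
    {"order_id": 42, "amount": 19.99},
    SerializationContext("orders", MessageField.VALUE),
)
producer.produce("orders", value=value)
producer.flush()
```

The same registered schemas are what downstream consumers (Kafka Streams, ksqlDB, or the Snowflake sink) would typically rely on for compatible schema evolution.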
Preferred Qualifications
Confluent Certified Developer or Terraform Associate Certification.
Familiarity with Snowpipe Streaming, Kafka REST Proxy, or custom connector development.
Understanding of Kafka security: SSL, SASL, RBAC.
Exposure to cloud-native Kafka solutions (e.g., Confluent Cloud on AWS/Azure).
Job Reference Number
18300
Skills to be evaluated on
Confluent Kafka; Terraform; DevOps pipelines to design, implement, and maintain real-time streaming solutions; streaming data from SQL Server (using CDC) to Snowflake; working closely with DevOps and infrastructure teams; strong experience with TFS (Azure DevOps) and infrastructure as code (IaC) using Terraform; Data Streaming; Data Engineering
Mandatory Skills
Data Streaming; Data Engineering; hands-on expertise in Confluent Kafka, Terraform, and DevOps pipelines to design, implement, and maintain real-time streaming solutions; strong experience with TFS (Azure DevOps) and infrastructure as code (IaC) using Terraform.