Sr. Kafka Data Integration Engineer - Atlanta, GA (Onsite)

Overview

On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

Azure
Java
Splunk
Python
Kafka
Integration
data pipelines
REST APIs
observability
Kafka Connect
Prometheus
log data formats
ingestion pipelines

Job Details

Role Name: Data Integration Engineer (Kafka & Azure)
Location: Atlanta, GA (Onsite)
Duration: 12 Months

 

Key Responsibilities:
• Develop Kafka consumers and implement minimal data transformations (see the consumer sketch after these lists).
• Integrate data pipelines with Azure Monitor and other observability tools.
• Build and maintain connectors using Kafka Connect and REST APIs.
• Collaborate with cross-functional teams to ensure data quality and reliability.

Required Skills:
• Strong programming skills in Java, Python, or Scala.
• Experience with Kafka Streams, the Consumer API, and Schema Registry.
• Proficiency with Azure SDKs and event-driven architecture.
• Hands-on experience with Splunk, Prometheus, and PromQL.
• Familiarity with log data formats and ingestion pipelines.

Good to Have:
• Experience with Splunk HEC integration (see the HEC sketch after these lists).
• Kafka Connect plugin setup and customization.
• Exposure to CI/CD and infrastructure automation.
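
To illustrate the consumer and transformation work described above, the sketch below shows a minimal Kafka consumer (Java Consumer API) that applies a trivial transformation before handing records off. The broker address, consumer group, and topic name are placeholder assumptions, not details taken from this posting.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class LogEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "log-ingest-group");        // placeholder consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("app-logs")); // placeholder topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Minimal transformation: normalize whitespace, then emit with source metadata.
                    String normalized = record.value().trim();
                    System.out.printf("topic=%s offset=%d value=%s%n",
                            record.topic(), record.offset(), normalized);
                }
            }
        }
    }
}
```

For the Splunk HEC item under "Good to Have", a comparable sketch is shown below: forwarding an event to a Splunk HTTP Event Collector endpoint over REST. The endpoint URL and token are placeholders; the event fields would follow whatever log format the pipeline actually carries.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SplunkHecSender {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint and token; substitute real values for a working integration.
        String hecUrl = "https://splunk.example.com:8088/services/collector/event";
        String hecToken = "YOUR-HEC-TOKEN";
        String payload = "{\"event\": {\"message\": \"kafka record forwarded\", \"source\": \"app-logs\"}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(hecUrl))
                .header("Authorization", "Splunk " + hecToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HEC response: " + response.statusCode() + " " + response.body());
    }
}
```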

 

Best Regards,

Vishal

Truth Lies in Heart

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.