Kafka Engineer / Administrator / Developer

Remote • Posted 5 hours ago • Updated 5 hours ago
Contract W2
$50/hr
Job Details

Skills

  • Kafka

Summary

Kafka Engineer / Administrator / Developer
Pay: $50/hr on W2
Location: Remote


The Kafka Engineer / Administrator / Developer is a key member of the program technical team, supporting large-scale data streaming, system integration, and platform modernization initiatives. This role is responsible for designing, developing, administering, and optimizing Apache Kafka clusters and event-driven architectures that support high-volume, mission-critical data flows. The Kafka Engineer works closely with Federal Government stakeholders, architects, developers, DevOps teams, API Gateway (APIGW) teams, and backend system owners to ensure reliable, secure, and scalable event streaming pipelines. This role plays a critical part in enabling real-time data integration, microservices communication, and operational resilience across complex enterprise systems.



Key Functions

Kafka Engineering & Administration

  • Design, build, administer, and maintain Kafka clusters across development, test, and production environments.
  • Manage Kafka topics, partitions, brokers, replication, retention policies, and access controls.
  • Monitor Kafka performance, availability, throughput, and latency; proactively identify and resolve issues.
  • Perform capacity planning, tuning, upgrades, patching, and disaster recovery planning for Kafka environments.
  • Implement and maintain highly available, fault-tolerant Kafka configurations.
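The administration duties above center on exactly this kind of broker tuning. As a hedged illustration only, a minimal `server.properties` sketch — every value here is an illustrative placeholder, not a requirement of this role:

```properties
# Hypothetical broker settings for a fault-tolerant cluster (values illustrative)
broker.id=1
# Replicate each partition across three brokers so a single broker failure loses no data
default.replication.factor=3
min.insync.replicas=2
# Retain messages for 7 days; per-topic retention overrides are common
log.retention.hours=168
# Prefer consistency over availability by disallowing out-of-sync replicas as leaders
unclean.leader.election.enable=false
```

In practice these defaults are weighed against throughput and storage targets during the capacity-planning work the posting describes.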

Event Streaming & Integration

  • Develop and support event streaming pipelines using Kafka for real-time and near-real-time data processing.
  • Integrate Kafka with API Gateway (APIGW) based microservices and downstream backend systems.
  • Design and implement Kafka producers, consumers, and connectors (e.g., Kafka Connect) to support system integrations and ETL/data movement needs.
  • Collaborate with application teams to define event schemas, topics, and data contracts.
  • Ensure reliable message delivery, data integrity, and error handling across streaming workflows.
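The schema and data-contract work above can be sketched in plain Python. The `ClaimSubmitted` event, its topic, its fields, and the version check are hypothetical illustrations of what producers and consumers might agree on, not details taken from this posting:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical data contract for a "claims.submitted" topic: producers
# serialize this shape, consumers validate and deserialize it.
@dataclass(frozen=True)
class ClaimSubmitted:
    claim_id: str
    amount_cents: int
    schema_version: int = 1

    def to_json(self) -> bytes:
        """Serialize to the wire format a producer would publish."""
        return json.dumps(asdict(self)).encode("utf-8")

    @classmethod
    def from_json(cls, payload: bytes) -> "ClaimSubmitted":
        """Deserialize and validate what a consumer reads off the topic."""
        data = json.loads(payload)
        if data.get("schema_version") != 1:
            raise ValueError("unsupported schema version")
        return cls(**data)

# Round-trip check: what the producer writes is exactly what the consumer recovers.
event = ClaimSubmitted(claim_id="C-1001", amount_cents=25_000)
assert ClaimSubmitted.from_json(event.to_json()) == event
```

In a real deployment this contract would typically live in a Schema Registry (mentioned under preferred qualifications) rather than in application code, so producers and consumers can evolve the schema independently.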

Security, Compliance & Operations

  • Implement Kafka security best practices, including authentication, authorization, encryption in transit, and auditing.
  • Ensure Kafka implementations comply with CMS security, data governance, and operational standards.
  • Support DevSecOps practices, CI/CD pipelines, and infrastructure-as-code approaches where applicable.
  • Participate in incident response, root cause analysis, and operational readiness activities.
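The authentication, authorization, and encryption-in-transit duties above typically translate into client settings like the following sketch. The endpoint, principal, and mechanism choices are hypothetical placeholders, not program requirements:

```properties
# Hypothetical Kafka client settings enforcing encryption in transit and
# authenticated access (host, username, and paths are placeholders)
bootstrap.servers=broker1.example.gov:9093
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="svc-stream" password="<from-secret-store>";
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
```

Authorization (which principals may read or write which topics) is then enforced broker-side via ACLs, with auditing layered on top per the program's governance standards.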

Collaboration & Documentation

  • Work closely with architects, developers, DevOps engineers, and system administrators to support solution design and delivery.
  • Document Kafka architectures, configurations, operational procedures, and integration patterns.
  • Provide technical guidance, troubleshooting support, and knowledge transfer to internal teams.


Minimum Qualifications

  • Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
  • 3+ years of experience developing, administering, and supporting Apache Kafka in enterprise environments.
  • Hands-on experience managing Kafka clusters, topics, partitions, and event streaming pipelines.
  • Experience integrating Kafka with microservices, API Gateways (APIGW), and backend systems.
  • Strong understanding of event-driven architectures, messaging patterns, and data streaming concepts.
  • Experience with Linux-based environments and command-line administration.
  • Strong troubleshooting and performance tuning skills.
  • Ability to clearly communicate technical concepts to both technical and non-technical stakeholders.


Preferred Qualifications

  • Experience supporting federal healthcare programs.
  • Experience working in Agile, Scrum, and/or DevSecOps environments.
  • Familiarity with cloud-based Kafka deployments (AWS MSK or similar managed Kafka services).
  • Experience with CI/CD pipelines and automation tools.
  • Knowledge of cloud security concepts and secure data transmission.
  • Experience with monitoring tools and observability platforms for Kafka (e.g., Prometheus, Grafana, CloudWatch).
  • Familiarity with schema management tools (e.g., Schema Registry).
  • Knowledge of containerized environments and orchestration tools (Docker, Kubernetes) is a plus.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91132048
  • Position Id: 8921935