Job Details
Responsibilities:
Architect, design, and develop scalable real-time event streaming and data processing applications using Java 11/17+ (Spring Boot, Multithreading, Reactive Programming).
Implement and optimize Kafka Connectors for seamless integration with AWS EventBridge, S3, DynamoDB, RDS, and other AWS services.
Design and implement event-driven architectures by integrating Confluent Kafka with AWS EventBridge to support serverless and microservices patterns.
Configure, deploy, and monitor Kafka Connect Source/Sink connectors for AWS EventBridge, Kinesis, DynamoDB, and Lambda.
Work on Kafka Streams and ksqlDB for real-time data processing and transformations (an illustrative sketch follows this list).
Develop and manage Kafka Schema Registry with Avro, JSON Schema, and Protobuf.
Optimize Kafka cluster performance through partition tuning, consumer lag monitoring, and tiered storage.
Implement secure Kafka configurations (RBAC, ACLs, SSL/TLS, OAuth) and enforce best security practices.
Automate deployment of Kafka and EventBridge integrations using Terraform, Kubernetes, and Helm.
Develop CI/CD pipelines for real-time data streaming applications using Jenkins and GitHub Actions.
Implement monitoring and observability using Confluent Control Center, AWS CloudWatch, Prometheus, and Datadog.
Collaborate with Data Engineers, DevOps, and Cloud teams to ensure seamless event-driven workflows.
Provide technical leadership and mentorship to junior engineers.
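For the Kafka Streams responsibility above, the following is a minimal, illustrative Java sketch of a stateless stream transformation. It is a sketch only: the topic names ("orders", "orders-enriched"), application id, and broker address are placeholder assumptions, not details of the role's actual environment.

```java
// Minimal Kafka Streams sketch: read from a source topic, apply a stateless
// transformation, and write to a sink topic. Topic names and broker are assumed.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class OrderStreamSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-stream-sketch");   // assumed application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");     // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> orders = builder.stream("orders");               // hypothetical source topic

        // Stateless transformation: drop empty payloads, then uppercase the value.
        orders.filter((key, value) -> value != null && !value.isBlank())
              .mapValues(String::toUpperCase)
              .to("orders-enriched");                                            // hypothetical sink topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```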
Core Requirements:
Spring Boot, Multithreading, Reactive Programming
Hands-on experience with Kafka Connectors integrating with AWS EventBridge, S3, DynamoDB, RDS, and other AWS services.
Experience developing and managing Kafka Schema Registry with Avro, JSON Schema, and Protobuf (see the producer sketch below).
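To illustrate the Schema Registry requirement, here is a minimal producer configuration sketch using the Confluent Avro serializer. The broker address, Schema Registry URL, topic name ("payments"), and the Payment schema are assumptions for the example only.

```java
// Minimal Avro + Schema Registry producer sketch. Broker, registry URL,
// topic, and schema are placeholder assumptions for illustration.
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");      // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");           // Confluent Avro serializer
        props.put("schema.registry.url", "http://localhost:8081");                 // assumed Schema Registry URL

        // Hypothetical Avro schema for the example payload.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Payment\",\"fields\":[" +
            "{\"name\":\"id\",\"type\":\"string\"},{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord payment = new GenericData.Record(schema);
        payment.put("id", "pay-001");
        payment.put("amount", 42.50);

        // Serialize with Avro and register/validate the schema against the registry on send.
        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("payments", "pay-001", payment));
        }
    }
}
```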