Position Opened Date | 8/27/2025 |
Position ID | Kafka Administrator |
Position Title | Senior Kafka Administrator with Ansible |
Vendor Notes | Must be on-site five days a week in Woodlawn, MD |
Planned Start | 9/15/2025 |
Key Required Skills: Kafka Architecture, Ansible Automation, RHEL/Linux Administration, Scripting (Bash, Shell, Python), Availability Monitoring / Triage (Splunk, Dynatrace, Prometheus). |
Position Description:
- Architect, design, develop, and implement a next-generation data streaming and event-driven architecture/platform using software engineering best practices and the latest technologies:
  - Data streaming, event-driven architecture, and event processing frameworks
  - DevOps (Jenkins, Red Hat OpenShift, Docker, SonarQube)
  - Infrastructure-as-Code and Configuration-as-Code (Ansible, Terraform/CloudFormation, scripting)
- Administer Kafka on Linux, including automation, installation, migration, upgrades, deployment, troubleshooting, and configuration.
- Provide expertise in one or more of these areas: Kafka administration, event-driven architecture, automation, application integration, monitoring and alerting, security, business process management/business rules processing, CI/CD pipeline and containerization, or data ingestion/data modeling.
- Investigate and repair issues and actively ensure business continuity regardless of the impacted component: the Kafka platform, business logic, middleware, networking, the CI/CD pipeline, or the database (PL/SQL and data modeling).
- Brief management, customers, the team, or vendors, in writing or orally, at the appropriate technical level for the audience.
- All other duties as assigned or directed
|
Skills Requirements:
FOUNDATION FOR SUCCESS (Basic Qualifications)
- Bachelor's Degree in Computer Science, Mathematics, Engineering, or a related field.
- A Master's or Doctorate degree may substitute for required experience.
- 8+ years of combined experience with Site Reliability Engineering, DevOps support, and/or RHEL administration for mission-critical platforms, ideally Kafka.
- 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK)
- 4+ years of experience with Ansible automation
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).
*** Selected candidates must be willing to work on-site in Woodlawn, MD, 5 days a week.
FACTORS TO HELP YOU SHINE (Required Skills)
These skills will help you succeed in this position:
- Strong experience with Ansible Automation, including authoring playbooks and roles for installing, maintaining, or upgrading platforms
- Solid experience using version control software such as Git/Bitbucket, including peer-reviewing Ansible playbooks
- Hands-on experience administering a Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation.
- Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies.
- Strong experience in automating tasks with scripting languages like Bash, Shell, or Python
- Solid foundation in Red Hat Enterprise Linux (RHEL) administration
- Basic networking skills
- Solid experience triaging and monitoring complex issues, outages, and incidents
- Experience integrating and maintaining various third-party tools such as ZooKeeper, Flink, Pinot, Prometheus, and Grafana.
- Experience with Platform-as-a-Service (PaaS) using Red Hat OpenShift/Kubernetes and Docker containers
- Experience working on Agile projects and understanding Agile terminology.
HOW TO STAND OUT FROM THE CROWD (Desired Skills)
Showcase your knowledge of modern development through the following experience or skills:
- Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) certification preferred
- Practical experience with event-driven applications and at least one event processing framework, such as Kafka Streams, Apache Flink, or ksqlDB.
- Understanding of Domain-Driven Design (DDD) and experience applying DDD patterns in software development.
- Experience working with Kafka connectors and/or supporting the operation of the Kafka Connect API
- Experience with Avro / JSON data serialization and schema governance with Confluent Schema Registry.
- Experience with AWS cloud technologies or other cloud providers preferred; AWS cloud certifications are a plus.
- Experience with Infrastructure-as-Code (CloudFormation/Terraform, scripting)
- Solid knowledge of relational databases (PostgreSQL, DB2, or Oracle), NoSQL databases (MongoDB, Cassandra, DynamoDB), SQL, and/or ORM technologies (JPA2, Hibernate, or Spring JPA)
- Knowledge of the Social Security Administration (SSA)
|
Education:
- Bachelor's Degree with 7+ years of experience
- Must be able to obtain and maintain a Public Trust clearance (contract requirement).
|