Kafka Cloud Architect

  • Baltimore, MD
  • Posted 12 hours ago | Updated 12 hours ago

Overview

On Site
USD 126,100.00 - 227,950.00 per year
Full Time

Skills

Data Warehouse
Mentorship
Modeling
Data Integrity
System Integration
Enterprise Services
SOA
Business Process Management
Business Rules
Data Modeling
Computer Science
Mathematics
Software Development
Software Analysis
Systems Analysis/Design
Security Clearance
Capacity Management
Disaster Recovery
Extract, Transform, Load (ETL)
Replication
Performance Tuning
Management
High Availability
SLA
Microservices
Computer Networking
Load Balancing
Use Cases
Apache ZooKeeper
Writing
Apache Avro
JSON
Data Compression
Apache Flink
Amazon RDS
Amazon S3
Relational Databases
PostgreSQL
IBM DB2
PL/SQL
Object-relational Mapping
Hibernate
JPA
Streaming
Amazon Web Services
Cloud Computing
Continuous Integration
Continuous Delivery
DevOps
PaaS
Red Hat Linux
Kubernetes
Docker
Configuration Management
Ansible
Terraform
Spring Framework
Cloud Security
J2EE
Java
Concurrent Programming
Unit Testing
Test-driven Development
Behavior-driven Development
JUnit
Mockito
Cucumber
Selenium
Jasmine
Visualization
Grafana
Open Source
Apache Kafka

Job Details

The Digital Modernization Sector has an opening for a Kafka Cloud Architect to work in Woodlawn, MD.

This position requires onsite work in Woodlawn, MD five days a week.

Day to Day Responsibilities:
  • Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings with the team.
  • Work alongside the customer to determine expanded use of Kafka within the Agency.
  • Strategize within Leidos to create opportunities to explore new technologies for use with Kafka.
  • Architect, design, code, and implement next-generation data streaming and event-based architecture / platform on Confluent Kafka.
  • Define the strategy for streaming data to the data warehouse and for integrating the event-based architecture with microservice-based applications.
  • Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
  • Mentor existing team members by imparting expert knowledge to build a high-performing team around our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
  • Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, data ingestion/data modeling.
  • Triage, investigate, and advise in a hands-on capacity to resolve platform issues, regardless of component.
  • Brief management, the customer, the team, or vendors, in writing or orally, at the appropriate technical level for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to business challenges, present and host workshops with senior leaders, and translate technical jargon into layman's terms and vice versa.
  • All other duties as assigned or directed.
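The event-modeling and data-integrity work above rests on Kafka's keyed partitioning: records with the same key always land on the same partition, which is what preserves per-key ordering. A minimal sketch of that idea (class and method names are illustrative; Kafka's default partitioner actually uses murmur2 hashing, not `String.hashCode()`):

```java
// Simplified illustration of keyed partition assignment. Real Kafka clients
// use murmur2 hashing; the point here is only the determinism: same key,
// same partition, hence per-key ordering.
public class PartitionSketch {
    public static int partitionFor(String key, int numPartitions) {
        // mask off the sign bit so the result is never negative
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("order-42", 6);
        int p2 = partitionFor("order-42", 6);
        System.out.println(p1 == p2); // same key -> same partition
    }
}
```

This is why choosing the event key (e.g., an order ID vs. a customer ID) is an architectural decision: it fixes both the ordering guarantee and the distribution of load across partitions.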

Foundation for Success (Required Qualifications):

This experience is the foundation a candidate needs to be successful in this position:
  • Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience. Additional years of experience may be accepted in lieu of a degree.
  • 12+ years of experience with modern software development including systems/application analysis and design.
  • 7+ years of combined experience with Kafka (One or more of the following: Confluent Kafka, Apache Kafka, and/or Amazon MSK).
  • 2+ years of combined experience with designing, architecting, and deploying to AWS cloud platform.
  • 1+ years of leading a technical team.
  • Must be able to obtain and maintain a Public Trust security clearance.

Factors to Help You Shine (Required Qualifications):

These skills will help you succeed in this position:
  • Expert, hands-on production experience with Confluent Kafka, including capacity planning, installation, and administration/platform management, with a deep understanding of Kafka architecture and internals.
  • Expert experience with Kafka clusters, security, disaster recovery, data pipelines, data replication, and/or performance optimization.
  • Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and high-availability (HA) and SLA architecture.
  • Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission critical infrastructure.
  • Expert experience with Kafka Connect, KStreams, and KSQL, and the judgment to apply each effectively across different use cases.
  • Hands-on experience with scaling Kafka infrastructure including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center.
  • Hands-on experience in designing, writing, and operationalizing new Kafka Connectors.
  • Solid experience with data serialization using Avro and JSON, and with data compression techniques.
  • Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
  • Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA).
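The Avro-vs-JSON bullet above comes down to where the schema lives. A toy comparison (hand-rolled binary layout, not real Avro; the class name and field choices are illustrative): in a schema-driven format, field names and types live in the schema, not in every message, so the wire payload is much smaller.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Toy contrast between a self-describing JSON payload and a schema-driven
// binary layout. Real Avro adds varint encoding and schema evolution; the
// size difference shown here is the core idea.
public class SerializationSketch {
    public static byte[] asJson(long id, double amount) {
        // field names repeated in every message
        return String.format("{\"id\":%d,\"amount\":%s}", id, amount)
                     .getBytes(StandardCharsets.UTF_8);
    }

    public static byte[] asBinary(long id, double amount) {
        // 8 bytes for the long + 8 for the double; field order comes from the schema
        return ByteBuffer.allocate(16).putLong(id).putDouble(amount).array();
    }

    public static void main(String[] args) {
        System.out.println(asJson(123456789L, 19.99).length);   // JSON byte count
        System.out.println(asBinary(123456789L, 19.99).length); // always 16
    }
}
```

At Kafka throughput, that per-message overhead (before any compression) is why schema-registry-backed Avro is the common choice over plain JSON.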

How to Stand Out from the Crowd (Desired Qualifications):

Showcase your knowledge of modern development using data streaming and event-based architecture through the following experience or skills:
  • AWS cloud certifications.
  • Continuous Integration/Continuous Delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to production.
  • PaaS using Red Hat OpenShift/Kubernetes and Docker containers.
  • Experience with configuration management tools (Ansible, CloudFormation / Terraform).
  • Solid experience with Spring Framework (Boot, Batch, Cloud, Security, and Data).
  • Solid knowledge of Java EE, Java generics, and concurrent programming.
  • Solid experience with automated unit testing, TDD, BDD, and associated technologies (JUnit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
  • Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka.

Original Posting:
May 2, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range:
$126,100.00 - $227,950.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.