Job Details
TECHNOGEN, Inc. has been a proven leader in providing full IT services, software development, and solutions for 15 years.
TECHNOGEN is a Small & Woman Owned Minority Business with GSA Advantage Certification. We have offices in VA and MD and offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state, and local agencies.
Job Title: Kafka Cloud Architect
Location: Woodlawn, MD
Duration: Long-term
Key Required Skills: Confluent Kafka, Apache Flink, Kafka Connect, Python, Java and Spring Boot.
Position Description:
- Lead and organize a team of Kafka administrators and developers, assign tasks, and facilitate weekly Kafka Technical Review meetings with the team.
- Work alongside the customer to determine expanded use of Kafka within the Agency.
- Strategize within Leidos to create opportunities to explore new technologies for use with Kafka.
- Architect, design, code, and implement next-generation data streaming and event-based architecture / platform on Confluent Kafka.
- Define the strategy for streaming data to the data warehouse and for integrating event-based architecture with microservice-based applications.
- Establish Kafka best practices and standards for implementing the Kafka platform based on identified use cases and required integration patterns.
- Mentor existing team members by imparting expert knowledge to build a high-performing team in our event-driven architecture. Assist developers in choosing correct patterns, modeling events, and ensuring data integrity.
- Provide software expertise in one or more of these areas: application integration, enterprise services, service-oriented architectures (SOA), security, business process management/business rules processing, data ingestion/data modeling.
- Triage, investigate, and advise in a hands-on capacity to resolve platform issues regardless of component.
- Brief management, the customer, the team, or vendors, in writing or orally, at the appropriate technical level for the audience. Share up-to-date insights on the latest Kafka-based solutions, formulate creative approaches to address business challenges, present and host workshops with senior leaders, and translate technical jargon into layman's terms and vice versa.
Detailed Skills Requirements:
- Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field with 12 years of relevant experience, OR a Master's degree with 10 years of relevant experience.
- Additional years of experience may be substituted/accepted in lieu of a degree.
- 12+ years of experience with modern software development including systems/application analysis and design.
- 7+ years of combined experience with Kafka (Confluent Kafka and/or Apache Kafka).
- 2+ years of combined experience with designing, architecting, and deploying to AWS cloud platform.
- 1+ years of experience leading a technical team.
FACTORS TO HELP YOU SHINE (Required Skills):
These skills will help you succeed in this position:
- Expert, hands-on production experience with Confluent Kafka, including capacity planning, installation, and administration / platform management, plus a deep understanding of Kafka architecture and internals.
- Expert in Kafka cluster and application security.
- Strong knowledge of Event-Driven Architecture (EDA).
- Expert experience in data pipelines, data replication, and/or performance optimization.
- Kafka installation and partitioning on OpenShift or Kubernetes, topic management, and high-availability (HA) and SLA architecture.
- Strong knowledge and application of microservice design principles and best practices: distributed systems, bounded contexts, service-to-service integration patterns, resiliency, security, networking, and/or load balancing in large mission critical infrastructure.
- Expert experience with Kafka Connect, KStreams, and KSQL, and the ability to apply each effectively for different use cases.
- Hands-on experience with scaling Kafka infrastructure including Broker, Connect, ZooKeeper, Schema Registry, and/or Control Center.
- Hands-on experience in designing, writing, and operationalizing new Kafka Connectors.
- Solid experience with data serialization using Avro and JSON and data compression techniques.
- Experience with AWS services such as ECS, EKS, Flink, Amazon RDS for PostgreSQL, and/or S3.
- Basic knowledge of relational databases (PostgreSQL, DB2, or Oracle), SQL, and ORM technologies (JPA2, Hibernate, and/or Spring JPA).
HOW TO STAND OUT FROM THE CROWD (Desired Skills):
Showcase your knowledge of modern development through the following experience or skills:
- Disaster recovery strategy
- Domain Driven Design (DDD)
- AWS cloud certifications.
- Continuous integration/continuous delivery (CI/CD) best practices and use of DevOps to accelerate quality releases to production.
- PaaS using Red Hat OpenShift/Kubernetes and Docker containers.
- Experience with configuration management tools (Ansible, CloudFormation / Terraform).
- Solid experience with Spring Framework (Boot, Batch, Cloud, Security, and Data).
- Solid knowledge with Java EE, Java generics, and concurrent programming.
- Solid experience with automated unit testing, TDD, BDD, and associated technologies (JUnit, Mockito, Cucumber, Selenium, and Karma/Jasmine).
- Working knowledge of the open-source visualization platform Grafana and the open-source monitoring system Prometheus, and their use with Kafka.
Education:
- Bachelor's degree with 12+ years of experience.
- Must be able to obtain and maintain a Public Trust security clearance.