Position Title: Software Engineer Lead - Contractor
Position Location: Strongsville, OH 44136 (or) Pittsburgh, PA 15222
Ability to work remote: No; 5 days onsite.
Acceptable time zone(s): EST
Days of the week: M-F 40 hours
Working Hours: M-F 8-5 EST
OT: Yes, possible
Travel: No
Potential for Contract Extension: Yes
This position is a contract role with right-to-hire if a need becomes available. The manager will only consider candidates who are open to converting to a full-time employee.
Function of the Group/Initiatives/Projects: We are part of a retail data platform (DSPCOE). Industry background: Finance/Banking is a plus; open to other backgrounds.
Team Dynamic: The candidate will join a team of at least nine, including business analysts and testers, work with an offshore team in a GCC, and interact with the different businesses.
Roles/Responsibilities:
- Multiple years of experience in software development with a strong focus on Java / J2EE technologies.
- Proven experience programming microservices-based applications and working with Kafka, Kafka Streams (KStreams), and Flink.
- Strong knowledge of Spring Boot, Spring Cloud, Hibernate, REST APIs.
- Hands-on experience writing queries with Oracle and MSSQL Databases.
- Good understanding of containerization technologies such as Docker, Kubernetes, and OpenShift (OCP)
- Experience with CI/CD pipelines, Git, Jenkins, and automated testing tools.
- Strong problem-solving skills and ability to lead technical teams.
- Provide technical guidance to colleagues and support solution development.
Preferred Skills:
- Certifications in Kafka or Java technologies.
- Experience with event-driven architecture and messaging systems like Kafka or RabbitMQ.
- Exposure to SAFe Agile/Scrum methodologies
- Understanding of Kafka architecture (brokers, partitions, topics, producers, consumers) (High level)
- Experience with Kafka Producers and Consumers using the Kafka Java client
- Knowledge of Kafka topic configurations (retention, replication, partitioning) (High level)
- Understanding of Kafka Streams
Distributed Processing Concepts (high level)
- Familiarity with event-driven architecture
- Knowledge of exactly-once processing vs at-least-once processing
- Understanding of stream-table duality (KStream vs. KTable)
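The at-least-once vs. exactly-once distinction above can be illustrated with a framework-free sketch: under at-least-once delivery a record may be replayed after a failure, so a consumer that wants effectively-once results keeps an idempotence check. This is a minimal stdlib-Java sketch; the class and method names are illustrative and not part of any Kafka API.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative sketch (not a Kafka API): at-least-once delivery may replay
// records, so the consumer deduplicates by key before applying side effects.
public class DedupeConsumer {
    private final Set<String> seenKeys = new HashSet<>();
    private final List<String> applied = new ArrayList<>();

    // Returns true if the record was applied, false if it was a duplicate replay.
    public boolean process(String key, String value) {
        if (!seenKeys.add(key)) {
            return false; // duplicate delivery: skip the side effect
        }
        applied.add(value);
        return true;
    }

    public List<String> applied() {
        return applied;
    }
}
```

In real Kafka Streams, exactly-once semantics are instead enabled via the `processing.guarantee=exactly_once_v2` configuration rather than hand-rolled deduplication.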
Schema Management
- Experience with Avro, Protobuf, or JSON for structured messages
Integration with External Systems
- Connecting Kafka Streams with databases (PostgreSQL, MongoDB, Cassandra)
- Using Kafka Connect for external data integration
- Knowledge of REST APIs and how to expose data from Kafka Streams
DevOps and Deployment
- Familiarity with Docker and Kubernetes for containerized deployment
- Using CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI)
- Logging and tracing using ELK (Elasticsearch, Logstash, Kibana) or OpenTelemetry (High level understanding)
Testing Kafka Streams Applications
- Writing unit tests with Mockito and JUnit
- Using TestContainers for integration testing with Kafka
- Validating Kafka Streams topologies using TopologyTestDriver
API Developers:
- Experience building REST APIs using Spring Boot
- Experience with Spring Data/Spring Data JPA for connecting to and reading from databases via APIs
- Experience writing unit tests using JUnit/Spock
- Familiarity with CI/CD pipelines using Jenkins
- Familiarity with SQL/NoSQL databases
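As a framework-free illustration of the request/response shape behind the REST API bullets above, here is a minimal JSON endpoint using the JDK's built-in `com.sun.net.httpserver.HttpServer`; in a Spring Boot service this would instead be a `@RestController` method. The class name and route are illustrative assumptions.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// Illustrative sketch: a GET /health endpoint returning JSON, using only the JDK.
public class HealthApi {
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes();
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

The Spring Boot equivalent would annotate a handler method with `@GetMapping("/health")` and let the framework handle serialization and the HTTP server.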
Nice-to-have Skills:
- Monitoring and Optimization
- Understanding of Kafka Streams metrics (through JMX, Grafana, Prometheus)
- Profiling performance and tuning configurations (buffer sizes, commit intervals)
- Handling out-of-order events and rebalancing issues
- Knowledge of Apache Flink or ksqlDB for alternative stream processing
- Knowledge of Docker, OpenShift
- Experience with tools like Dynatrace for troubleshooting
MUST HAVE SKILLS:
- Kafka: expert level (see Roles/Responsibilities for in-depth skill description)
- Java / J2EE technologies: expert level
- Apache basics
- Strong knowledge of Spring Boot, Spring Cloud, Hibernate, REST APIs.
- Experience building REST APIs using Spring Boot
- Experience with Spring Data/Spring Data JPA for connecting to and reading from databases via APIs
- Experience writing unit tests using JUnit/Spock
- Familiarity with CI/CD pipelines using Jenkins
- Familiarity with SQL/NoSQL databases
FLEX SKILLS:
- Knowledge of REST APIs
- DevOps and Deployment
- Familiarity with Docker and Kubernetes for containerized deployment
- Using CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI)
- Knowledge of Apache Flink or ksqlDB for alternative stream processing
- Knowledge of Docker, OpenShift
Soft Skills:
- Problem Solving Skills
- Ability to lead, mentor
Education:
- Bachelor's degree, or relevant experience will be considered
Role Differentiator:
- Growth, Opportunity, modern technologies
Interview Process:
- 1st round with manager: initial screen, ~30 minutes
- 2nd round with technical team (panel): ~45 minutes to 1 hour
- Possible 3rd round if further assessment is needed
--
Renu
Technical Recruiter, AGM TECH SOLUTIONS
848-800-8497