Position Summary
Our client is seeking a seasoned technical leader and hands-on engineer with a proven track record of delivering large-scale, distributed service implementations. The ideal candidate is passionate about building modern, mission-critical systems that are scalable, resilient, and aligned with business goals.
Key Responsibilities
Develop a deep understanding of business processes, workflows, and data dependencies.
Collaborate with project managers to define scope and gather requirements.
Contribute to technical specifications and documentation deliverables.
Partner with enterprise and application architects to design new solutions and enhance existing systems.
Build, configure, support, and document both new and legacy environments.
Create and maintain disaster recovery plans; participate in DR testing to ensure SLAs are met.
Plan and apply software patches for applications.
Design, implement, and maintain CI/CD pipelines, adhering to best practices.
Serve as a subject matter expert (SME) for assigned software packages.
Mentor and guide junior team members.
Build and maintain data pipelines and data assets in Elasticsearch.
Deliver robust search services with a focus on data quality.
Design and refine data flows using best practices.
Improve and optimize existing data flow processes.
Create and manage Vault policies and evaluate Vault token usage.
Configure and manage Vault secrets engines.
Provide on-call production support as needed.
Required Qualifications
Bachelor's degree in Computer Science or a related field.
3+ years of hands-on experience with at least one of the following: Apache NiFi, Elasticsearch, or HashiCorp Vault, including installation, configuration, log analysis, and performance tuning.
2-4 years of experience supporting middleware or packaged applications.
Experience administering Java application servers such as WebSphere, JBoss, Tomcat, or WebLogic.
Strong understanding of Red Hat Linux and supporting enterprise software solutions in that environment.
Familiarity with containerization tools like Docker and orchestration platforms like Kubernetes (highly preferred).
Knowledge of modern DevOps practices and tools such as Jenkins, Ansible, Git, SonarQube, Artifactory, and Azure DevOps (preferred).
Solid grasp of network protocols and standards including DHCP, DNS, SSL, TCP/UDP, IP, QoS, and ICMP.
Experience with load balancing, routing, and general network troubleshooting.
Proficiency in scripting with tools such as Ansible, Bash, Perl, PowerShell, or Python.
Experience with Application Performance Monitoring (APM) tools like Dynatrace is a plus.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills.
Self-starter who works well independently and as part of a team.
Adaptable and eager to work across different technologies and platforms.