Job Title: Hadoop Admin
Location: St. Louis, MO
Duration: 3-6 months contract
Essential responsibilities of the position:
- Work directly with application and development teams to fully understand requirements for projects and ad hoc requests, and provide accurate estimates.
- Support mission-critical Big Data environments, including highly available and contingency platforms.
- Create and maintain documentation and define best practices in support of proper change management, security and operational requirements.
- Investigate and understand new technologies and additional capabilities of current platforms that support the client's vision for Big Data, and lead their adoption and delivery.
- Provide maintenance and operational support activities related to Big Data platforms per client standards and industry best practices.
Essential knowledge, skills and experience:
- Strong knowledge of Hadoop platforms and other distributed data processing platforms.
- Advanced Linux knowledge is a must, including shell scripting and debugging.
- Strong background in delivering mission-critical Big Data project work while interacting with diverse and experienced teammates across the spectrum of enterprise operational functions.
- Strong problem-solving skills and the ability to troubleshoot complex problems that require coordination across multiple teams.
- Experience with CI/CD (continuous integration / continuous delivery) and infrastructure automation tools and techniques.
Desirable or additional capabilities:
- SQL knowledge is a must. Experience with advanced data warehousing and MPP (massively parallel processing) platforms is a plus.
- Understanding of and experience with network engineering in support of large IT organizations.
- Background in building and maintaining extract/transform/load (ETL) processes. Practical experience with large ETL pipelines is a plus.
- Experience building and maintaining Big Data technologies on public or private cloud technologies.
Thanks & Regards,
Email : firstname.lastname@example.org