Overview
Remote
Depends on Experience
Accepts corp-to-corp applications
Contract - W2
Contract - Independent
Contract - 18 Month(s)
No Travel Required
Able to Provide Sponsorship
Skills
Big Data engineering
DevOps practices
HDFS
Hive
Impala
PySpark
Python
Linux
CI/CD tools
Jenkins
uDeploy
ETL development
orchestration
performance optimization
ServiceNow
Lead Big Data Engineer
Senior Data Operations & DevOps Engineer
Big Data & ETL DevOps Specialist
Job Details
We are seeking a highly experienced Senior Big Data & DevOps Engineer to manage end-to-end data operations for enterprise-scale platforms. The ideal candidate will have 8+ years of experience in Big Data technologies, ETL development, and DevOps automation, with hands-on expertise in HDFS, Hive, Impala, PySpark, Python, Jenkins, and uDeploy. This role is critical in ensuring the stability, scalability, and efficiency of data platforms while enabling smooth development-to-production workflows.
Required Qualifications
- Bachelor's degree in Computer Science, IT, or a related field.
- 8+ years of experience in Big Data engineering and DevOps practices.
- Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux.
- Hands-on experience with CI/CD tools such as Jenkins and uDeploy.
- Strong understanding of ETL development, orchestration, and performance optimization.
- Experience with ServiceNow for incident/change/problem management.
- Excellent analytical, troubleshooting, and communication skills.
Nice to Have
- Exposure to cloud-based Big Data platforms (e.g., AWS EMR).
- Familiarity with containerization (Docker, Kubernetes) and infrastructure automation tools (Ansible, Terraform).