Overview
Workplace Type: On Site
Compensation: Depends on Experience
Employment Type: Full Time
Skills
Hadoop
ETL
PySpark
Job Details
Position: Hadoop / ETL Developer with PySpark
Location: Pittsburgh, PA (Day 1 Onsite)
Duration: Full Time
Required Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
8+ years of experience in Big Data engineering and DevOps practices.
Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux.
Proven experience with CI/CD tools such as Jenkins and uDeploy.
Strong understanding of ETL development, orchestration, and performance optimization.
Experience with ServiceNow for incident/change/problem management.
Excellent analytical, troubleshooting, and communication skills.