Pittsburgh, Pennsylvania • Posted 18 days ago
Position: Hadoop / ETL Developer with PySpark
Location: Pittsburgh, PA (Day 1 Onsite)
Duration: Full Time

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in Big Data engineering and DevOps practices.
- Advanced proficiency in HDFS, Hive, Impala, PySpark, Python, and Linux.
- Proven experience with CI/CD tools such as Jenkins and uDeploy.
- Strong understanding of ETL development, orchestration, and performance optimization.
Job Type: Full-time
Salary: Depends on Experience