Overview
Hybrid
$55 - $60
Contract - W2
Contract - 12 Months
Skills
Big Data
HDFS
Linux
SQL
Unix
Python
Job Details
Job Description:
We are seeking a skilled Data Engineer with expertise in Big Data technologies to join our client in San Jose, CA. This is a long-term W2 contract that requires an onsite presence three days per week.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL workflows.
- Work extensively with Big Data technologies including Hadoop, HDFS, Hive, and Spark SQL.
- Develop and optimize Python scripts for data processing and automation.
- Write and maintain Unix/Linux shell scripts for data workflows and automation.
- Collaborate with cross-functional teams to support data-driven decision making.
Required Skills:
- Strong experience in Python programming.
- Proficient with Hadoop ecosystem: Hadoop, HDFS, Hive, Spark SQL.
- Solid understanding of Unix/Linux operating systems and shell scripting.
- Extensive ETL experience; hands-on experience with Ab Initio preferred.
- Ability to work onsite in San Jose, CA at least 3 days per week.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.