Overview
On Site
Contract - W2
Contract - 12
Skills
GCP
Hadoop
Python
Big Data
Job Details
Hi,
Please find the JD below:
Big Data Engineer
Location: Chandler, AZ (Hybrid)
W2 Candidates Only
8+ Years of Experience
Must Have: Hadoop, Big Data ecosystem, Python, Google Cloud Platform
Job Description:
We're seeking an experienced Big Data Engineer to design, develop, and maintain scalable data pipelines, integrate cloud services, and enable actionable insights from large-scale datasets.
Responsibilities:
- Design & develop data pipelines using Hadoop, Hive, PySpark, Python
- Integrate data services with AWS S3, ensuring security & compliance
- Perform data modeling & relational database design (MySQL or similar)
- Manage job scheduling with Autosys
- Automate workflows using Spark, Python, Unix/Shell scripting
- Collaborate with stakeholders to deliver insights
- Utilize Power BI & Dremio for visualization & exploration
- Participate in CI/CD pipeline development & deployment
- Troubleshoot data pipeline issues
Required Skills & Experience:
- 8+ years in software/data engineering
- Strong knowledge of Hadoop ecosystem & big data frameworks
- Hands-on experience with AWS S3, data modeling, Autosys
- Proficient in Unix/Shell scripting & CI/CD
- Familiarity with Power BI & Dremio