Overview
On Site
$50+
Contract - W2
Contract - 24 Month(s)
Skills
Big Data
Hadoop
Spark
PySpark
Python
Java
Data Warehouse
SQL
MapReduce
Apache Hive
Unix
Job Details
Role: Big Data Engineer
Location: Phoenix, AZ (Day one onsite)
Duration: 6+ months
Minimum Qualifications:
- Master's degree in Computer Applications or equivalent, OR a bachelor's degree in Engineering, Computer Science, or equivalent.
- Deep understanding of Hadoop and Spark architecture and their working principles.
- Deep understanding of data warehousing concepts.
- Ability to design and develop optimized data pipelines for batch and real-time data processing.
- 5+ years of software development experience.
- 5+ years of experience with Python or Java.
- Hands-on experience writing and understanding complex SQL (Hive/PySpark DataFrames), including optimizing joins while processing large volumes of data.
- 3+ years of hands-on experience with MapReduce, Hive, and Spark (Core, SQL, and PySpark).
- Hands-on experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Composer).
- 3+ years of experience in UNIX shell scripting.
- Experience in the analysis, design, development, testing, and implementation of system applications.