Overview
Hybrid
Depends on Experience
Accepts corp-to-corp applications
Contract - W2
Contract - 12 Month(s)
Skills
GCP
Data Engineer
Spark
PySpark
Hadoop
Data Pipeline
BigQuery
Dataproc
Job Details
Role: Google Cloud Platform Big Data Engineer
Location: Sunnyvale, CA (Hybrid)/ Bentonville AR (Hybrid)
Duration: 12+ Months
Experience: 9+ years
Must have:
- Hadoop: 5+ years of experience
- Scala: 4+ years of experience
- Google Cloud Platform: 3+ years of experience
- Hive: 4+ years of experience
- SQL: 3+ years of experience
- ETL process / data pipeline: 4+ years of experience
Requirements:
- 5+ years of experience in Hadoop/Big Data.
- 2+ years of experience in strategic data planning, standards, procedures, and governance.
- 4+ years of hands-on experience in Python or Scala.
- 3+ years of experience writing and tuning SQL and Spark queries.
- Experience managing Hadoop log files and working with Hadoop's multiple data-processing engines.
- Experience analyzing data in HDFS through MapReduce, Hive, and Pig.
- Experience with scripting languages: Python, Scala, etc.
- 3+ years of experience in a Google Cloud Platform environment.
Thanks
Yashasvi Hasija
Technical Recruiter | Empower Professionals
Certified NJ and NY Minority Business Enterprise (NMSDC)
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.