Google Cloud Platform Data Engineer

Overview

Hybrid
$50 - $70
Contract - W2
Contract - 12 Month(s)

Skills

Data Engineer
Java
Python
GCP
BigQuery
Bigtable
Google Cloud Storage
Pub/Sub
Data Fusion
Dataflow
Dataproc
Teradata
Terraform
Agile

Job Details

Software Engineer Senior - 315309

Duration: 12 Months

Location: Dearborn, MI

Position Description:

GDIA Data Engineer Job Description

A position is open for a Data Engineer in the GDI&A Customer 360 team. The successful candidate will be responsible for designing and developing the transformation and modernization of big data solutions on Google Cloud Platform (GCP), integrating native GCP services and third-party data technologies, and building new data products on GCP. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design and develop the right solutions with the appropriate combination of GCP and third-party technologies for deployment on GCP.

Key Responsibilities:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the Data Platform.
- Implement methods for automating all parts of the pipeline to minimize labor in development and production.
- Identify, develop, evaluate, and summarize proofs of concept to prove out solutions.
- Test and compare competing solutions and report a point of view on the best solution.
- Apply experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
- Design and build production data engineering solutions to deliver pipeline patterns using GCP services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Migrate existing big data pipelines into GCP.
- Build new data products in GCP.

Skills Required:
- Minimum 3 years of in-depth experience in Java/Python.
- Minimum 2 years of experience building data engineering pipelines and data warehouse systems, with the ability to understand ETL principles and write complex SQL queries.
- Minimum 5 years of GCP experience working in GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, and Dataproc.
- Minimum 2 years of development experience with data warehousing and the big data ecosystem: Hive (HQL) and Oozie Scheduler; ETL with IBM DataStage and Informatica IICS with Teradata.
- 1 year of experience deploying Google Cloud services using Terraform.

Skills Preferred:
- Understands cloud as a way to operate, not a place to host systems.
- Understands data architectures and design independent of the technology.
- Experience with Python and shell scripting preferred.
- Exceptional problem-solving and communication skills, and management of multiple stakeholders.
- Experience working with Agile and Lean methodologies.
- Experience with test-driven development.

Experience Required: Same as the Skills Required section above.

Education Required: Bachelor's or master's degree in a relevant field.

Additional Information: Onsite and hybrid position.

About Xoriant Corporation