Overview
On Site
Accepts corp to corp applications
Contract - Long Term
Skills
SQL
Python
GCP
PYSPARK
BigQuery
Cloud Dataproc
Cloud Dataflow
Cloud Composer
Cloud Pub/Sub
Job Details
Position: Google Cloud Platform Data Engineer
Location: Dallas, TX (Remote)
Duration: Contract
Must Have: Google Cloud Platform, Cloud Pub/Sub, BigQuery, SQL, PySpark, Cloud Composer, Python, Cloud Dataproc, Cloud Dataflow
Job Summary:
- The Data Engineer is responsible for designing, developing, and maintaining scalable data pipelines and infrastructure on Google Cloud Platform (GCP).
Responsibilities:
- Bachelor's degree in Computer Science or equivalent, with a minimum of 12 years of relevant experience.
- The Google Cloud Platform Data Engineer will create, deliver, and support custom data products, as well as enhance and expand team capabilities.
- They will analyze and manipulate large datasets across the enterprise, activating data assets to support enabling platforms and analytics.
- Design the transformation and modernization of data platforms on Google Cloud Platform using GCP services.
- Build data systems and pipelines on GCP using Dataproc, Dataflow, BigQuery, and Pub/Sub.
- Apply strong programming knowledge of Python.
- Implement scheduled workflows and tasks in Cloud Composer (Apache Airflow).
- Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL.
- Monitor and troubleshoot data pipelines and storage solutions using GCP's Cloud Monitoring (formerly Stackdriver).
- Develop efficient ETL/ELT pipelines and orchestration using Dataprep and Cloud Composer.
- Develop and maintain data ingestion and transformation processes using Apache PySpark and Dataflow.
- Automate data processing tasks using scripting languages such as Python or Bash.
- Ensure data security and compliance with industry standards by configuring IAM roles, service accounts, and access policies.
- Automate cloud deployments; participate in code reviews and contribute to development best practices, using developer-assist tools to create robust, fail-safe data pipelines.
- Collaborate with Product Owners, Scrum Masters, and Data Analysts to deliver user stories and tasks and to ensure deployment of pipelines.
- Good to have: knowledge of SAS coding and working experience with Teradata.
With regards,
Vijay
Lorven Technologies, Inc.
101 Morgan Lane | Suite 209 | Plainsboro | NJ 08536
Tel: * 239