Google Cloud Platform Data Engineer

Overview

Remote
$50 - $60
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

GCP Data Engineer
Google Cloud Platform
BigQuery
Cloud Composer
Cloud Dataflow
Cloud Dataproc
Cloud SQL
Python
PySpark
Apache Spark
Apache NiFi
Hadoop
Hive
HDFS
REST API
ETL Pipelines
Data Integration
Data Architecture
Distributed Systems
Cloud Data Engineering
Data Processing
Cloud Analytics
Data Governance
Scalable Pipelines
Google Cloud
Data Warehouse

Job Details

Position: Google Cloud Platform Data Engineer
Location: Open (Remote or Onsite Options)
Duration: 12 Months

Overview:
We are seeking a highly skilled Google Cloud Platform Data Engineer to design, develop, and optimize large-scale data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate has hands-on experience with Python, Apache Spark, BigQuery, and related Google Cloud Platform services, with a passion for building scalable data architectures that drive business insight and efficiency.

Key Responsibilities:

  • Design and implement high-performance data pipelines and transformation workflows using Google Cloud Platform services such as Cloud Dataflow, Dataproc, and Cloud Composer.
  • Develop and maintain BigQuery-based data models for analytics and reporting.
  • Integrate and process data from diverse sources using Python, PySpark, and Apache NiFi.
  • Collaborate with cross-functional teams to translate business requirements into robust data engineering solutions.
  • Optimize performance across data systems through effective partitioning, indexing, and query tuning.
  • Manage data quality, governance, and scalability within cloud-based environments.
  • Work closely with data scientists, analysts, and application teams to ensure seamless data access and reliability.
  • Ensure adherence to data security, regulatory compliance, and best practices in cloud-based architectures.

Required Skills & Experience:

  • Strong expertise in Python, PySpark, and SQL programming.
  • Deep understanding of Google Cloud Platform components including BigQuery, Cloud Composer, Dataflow, Dataproc, and Cloud SQL.
  • Experience with Apache Spark, Apache NiFi, and Hadoop-based ecosystems.
  • Expertise in designing, managing, and optimizing large-scale distributed data applications.
  • Familiarity with APIs, REST architecture, and integration techniques.
  • Strong analytical, problem-solving, and communication skills in collaborative settings.


About VDart Group
VDart Group is a global leader in technology, product, and talent solutions, serving Fortune 500 clients in 13 countries. With over 4,000 professionals worldwide, we deliver innovation, operational excellence, and measurable outcomes across industries. Guided by our commitment to People, Purpose, and Planet, VDart is recognized with an EcoVadis Bronze Medal and as a UN Global Compact member, reflecting our dedication to sustainable practices.

