Senior Data Engineer – Google Cloud Platform, BigQuery, Spark *** Direct End Client ***

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Staff Data Engineer
data architecture
data modeling
ETL
data integration
SQL
BigQuery
Hadoop
Spark
Dataflow
Airflow
cloud platforms
GCP
Python
knowledge graph
Neo4j
data engineering
data warehousing
scalable data solutions
data pipelines
dimensional modeling
data analysis
data governance
data infrastructure
generative AI
Vertex AI
AWS Cloud Search
data-driven
technical leadership
Amazon Web Services
Apache Hadoop
Apache Spark
Artificial Intelligence
Cloud Computing
Data Flow
Data Management
Data Processing
Data Quality
Data Warehouse
Extract
Transform
Load
Data engineer
DE
Google Cloud Platform
Cloud Composer
PySpark
Monte Carlo
data observability
ETL pipelines
ELT pipelines
data pipeline
data engineering jobs
cloud data engineer
data reliability
data lineage
CI/CD
Terraform
remote data engineer job
Apache Airflow
ELT
Google Cloud

Job Details

Job Title: Senior Data Engineer – Google Cloud Platform, BigQuery, Spark, Monte Carlo

We are seeking a highly skilled and motivated Senior Data Engineer to join our data platform team. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP) services, including BigQuery, Cloud Composer, Dataflow, and Apache Spark/PySpark, along with a strong understanding of data quality frameworks using Monte Carlo.

You will be responsible for designing, building, and maintaining scalable data pipelines, ensuring high data quality and reliability across our analytics and operational systems.

Key Responsibilities:

  • Design and implement scalable ETL/ELT pipelines using Dataflow, Composer, and Spark/PySpark
  • Optimize and manage large-scale datasets in BigQuery
  • Monitor and improve data quality using Monte Carlo
  • Collaborate with data scientists, analysts, and business stakeholders to deliver reliable data solutions
  • Automate workflows and orchestrate pipelines using Apache Airflow (via Composer)
  • Ensure data governance, lineage, and observability across the data ecosystem
  • Troubleshoot and resolve data pipeline issues in production environments

Required Qualifications:

  • 4+ years of experience in data engineering or related field
  • Strong proficiency in Google Cloud Platform services: BigQuery, Dataflow, Cloud Composer
  • Expertise in Apache Spark and PySpark
  • Experience with Monte Carlo or similar data observability tools
  • Proficiency in SQL and Python
  • Familiarity with CI/CD pipelines and version control (e.g., Git)
  • Excellent problem-solving and communication skills

Preferred Qualifications:

  • Experience with real-time data processing and streaming
  • Knowledge of data modeling and warehousing best practices
  • Familiarity with Terraform or Infrastructure as Code (IaC) tools
  • Certification in Google Cloud Platform Data Engineering is a plus


data engineer, Google Cloud Platform, BigQuery, Cloud Composer, Dataflow, Apache Spark, PySpark, Monte Carlo, data quality, data observability, Airflow, ETL pipelines, ELT pipelines, data pipeline, data engineering jobs, cloud data engineer, data reliability, data governance, data lineage, Python, SQL, CI/CD, Terraform, remote data engineer job


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.