Google Cloud Platform Data Engineer

Overview

Hybrid
130,000 - 150,000
Full Time
Accepts corp to corp applications
No Travel Required
Unable to Provide Sponsorship

Skills

SQL
HIPAA
Batch Processing
Apache Beam
Data Engineering
Extract, Transform, Load
Google Cloud Platform

Job Details

Job Title: Senior Data Engineer – Google Cloud Platform (GCP)

 

🌟 Role Overview

We’re looking for a highly skilled Senior Data Engineer with 6+ years of experience to architect, build, and optimize scalable data pipelines and infrastructure on Google Cloud Platform (GCP). You’ll be instrumental in transforming raw data into actionable insights, enabling advanced analytics, machine learning, and real-time decision-making across the enterprise.

 

🔍 Key Responsibilities

  • Design, develop, and maintain robust data pipelines using GCP-native tools (e.g., Dataflow, Pub/Sub, BigQuery, Cloud Composer)
  • Build and optimize ETL/ELT workflows for structured and unstructured data across multiple sources
  • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality, reliable datasets
  • Implement data quality, observability, and lineage frameworks using tools like Dataplex, Data Catalog, and Looker
  • Ensure data security, privacy, and compliance with industry standards (e.g., GDPR, HIPAA, SOC 2)
  • Develop and maintain CI/CD pipelines for data workflows using Terraform, Git, and Cloud Build
  • Monitor and tune performance of data pipelines and storage systems for cost-efficiency and scalability
  • Support real-time streaming architectures and batch processing for analytics and ML use cases
  • Document technical designs, data flows, and operational procedures
  • Mentor junior engineers and contribute to engineering best practices

 

🛠️ Required Skills & Qualifications

  • 6+ years of experience in data engineering, including at least 2 years working on Google Cloud Platform
  • Strong proficiency in Python, SQL, and Apache Beam or similar frameworks
  • Hands-on experience with BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Cloud Composer
  • Solid understanding of data warehousing, data lakes, and lakehouse architectures
  • Experience with Airflow, dbt, or similar orchestration and transformation tools
  • Familiarity with DevOps for data: CI/CD, infrastructure-as-code, and version control
  • Knowledge of data governance, metadata management, and data security