We are seeking a Google Cloud Platform (GCP) Data Engineer with deep, hands-on architectural and development experience in GCP's big data ecosystem.
You will be responsible for designing, building, and optimizing a modern data lakehouse architecture.
Your primary focus will be leveraging BigLake, BigQuery, Google Cloud Storage (GCS), and Vertex AI to create seamless, scalable data pipelines and machine learning integrations that drive business intelligence and predictive analytics.
Required Qualifications
Experience: 5+ years of dedicated Data Engineering experience, with at least 3 years
focused exclusively on Google Cloud Platform (GCP).
Deep GCP Big Data Expertise:
BigQuery: Expert-level knowledge of BigQuery architecture, advanced SQL,
analytical functions, query profiling, and optimization techniques.
BigLake: Proven experience utilizing BigLake for multi-cloud or lakehouse
architectures, managing open-source formats (e.g., Apache Iceberg/Parquet),
and enforcing unified security policies.
GCS: Deep understanding of GCS storage classes, object lifecycle management,
and optimizing GCS for big data workloads.
Vertex AI: Hands-on experience with Vertex AI Pipelines, endpoints, and Feature Store,
or with deploying ML models into scalable data environments.
Programming Skills: Advanced proficiency in Python and SQL. Familiarity with Java,
Scala, or Go is a plus.
Data Orchestration & CI/CD: Experience with orchestration tools (e.g., Apache Airflow,
Cloud Composer) and modern CI/CD pipelines (e.g., GitHub Actions, Terraform, Cloud
Build).
Preferred/Bonus Qualifications
GCP Certifications: Google Cloud Certified - Professional Data Engineer or
Professional Machine Learning Engineer.