Job Description
We are seeking an experienced Google Cloud Platform (GCP) Data Engineer to join our data engineering team onsite in Denver, CO. The ideal candidate will design, build, and maintain scalable data pipelines and analytics solutions on GCP. This role requires strong hands-on experience with GCP-native services, data modeling, and large-scale data processing.
Key Responsibilities
Design, develop, and maintain ETL/ELT data pipelines on GCP
Build and optimize data solutions using BigQuery, Cloud Dataflow, Dataproc, and Cloud Composer
Develop batch and streaming data pipelines using Apache Beam, Spark, and Pub/Sub
Implement data ingestion from multiple sources (APIs, databases, flat files, streaming systems)
Ensure data quality, reliability, performance, and security best practices
Collaborate with data scientists, analysts, and application teams to support analytics and reporting needs
Optimize query performance and cost management in BigQuery
Implement CI/CD pipelines and infrastructure automation using Terraform or Deployment Manager
Monitor and troubleshoot data pipelines in production environments
Follow data governance, compliance, and security standards
Required Skills & Qualifications
6+ years of experience in Data Engineering
Strong hands-on experience with Google Cloud Platform (GCP)
Expertise in BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
Strong programming skills in Python and/or Java
Experience with Apache Spark, Apache Beam
Solid understanding of data warehousing, data modeling, and SQL
Experience with Airflow / Cloud Composer
Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform)
Experience working in Agile/Scrum environments