GCP Data Engineer

Remote • Posted 15 hours ago • Updated 15 hours ago
  • Contract: Corp-to-Corp / Independent / W2
  • No travel required
  • Remote
  • Compensation: Depends on experience

Job Details

Skills

  • Data Engineering

Summary

Our client is looking for a GCP Data Engineer in Denver, CO; the detailed requirements are below. Please share your updated resume if you are interested.
 

Job Title: GCP Data Engineer

Location: Denver, CO

Employment Type: Full-Time / Contract

 

Job Description

We are seeking an experienced GCP Data Engineer to join our data engineering team onsite in Denver, CO. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions on Google Cloud Platform (GCP). This role requires strong hands-on experience with GCP native services, data modeling, and large-scale data processing.

 

Key Responsibilities

  • Design, develop, and maintain ETL/ELT data pipelines on GCP
  • Build and optimize data solutions using BigQuery, Cloud Dataflow, Dataproc, and Cloud Composer
  • Develop batch and streaming data pipelines using Apache Beam, Spark, and Pub/Sub
  • Implement data ingestion from multiple sources (APIs, databases, flat files, streaming systems)
  • Ensure data quality, reliability, performance, and security best practices
  • Collaborate with data scientists, analysts, and application teams to support analytics and reporting needs
  • Optimize query performance and cost management in BigQuery
  • Implement CI/CD pipelines and infrastructure automation using Terraform or Deployment Manager
  • Monitor and troubleshoot data pipelines in production environments
  • Follow data governance, compliance, and security standards
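The pipeline responsibilities above follow a common extract → validate → transform shape. As a minimal, self-contained sketch (all record fields and names here are invented for illustration; a real GCP pipeline would express these steps as Apache Beam transforms on Dataflow and write to BigQuery):

```python
# Hypothetical sketch of one ETL step: ingest raw records, apply a
# data-quality gate, then aggregate. Names are invented; a production
# pipeline would use Beam primitives (beam.Map, beam.Filter) instead.
from dataclasses import dataclass
from typing import Dict, Iterable, List


@dataclass
class Event:
    user_id: str
    amount_cents: int


def extract(raw_rows: Iterable[dict]) -> List[Event]:
    """Ingest: parse raw dicts (e.g. from an API or flat file) into typed records."""
    return [Event(r["user_id"], int(r["amount_cents"])) for r in raw_rows]


def validate(events: Iterable[Event]) -> List[Event]:
    """Data-quality gate: drop records with a missing key or negative amount."""
    return [e for e in events if e.user_id and e.amount_cents >= 0]


def transform(events: Iterable[Event]) -> Dict[str, int]:
    """Aggregate per user, analogous to a GROUP BY in BigQuery SQL."""
    totals: Dict[str, int] = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0) + e.amount_cents
    return totals


raw = [
    {"user_id": "a", "amount_cents": "100"},
    {"user_id": "a", "amount_cents": "250"},
    {"user_id": "", "amount_cents": "50"},  # dropped by the quality gate
]
totals = transform(validate(extract(raw)))
```

Keeping each stage a pure function, as here, is what makes the same logic portable between batch and streaming runners.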

 

Required Skills & Qualifications

  • 6+ years of experience in Data Engineering
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
  • Strong programming skills in Python and/or Java
  • Experience with Apache Spark, Apache Beam
  • Solid understanding of data warehousing, data modeling, and SQL
  • Experience with Airflow / Cloud Composer
  • Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform)
  • Experience working in Agile/Scrum environments
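The Airflow / Cloud Composer item above comes down to declaring task dependencies as a DAG and letting the scheduler derive execution order. A stdlib-only sketch of that idea (task names are invented; real code would use `airflow.DAG` and operators):

```python
# Hypothetical illustration of DAG-style task ordering, the core idea
# behind Airflow / Cloud Composer scheduling. A topological sort over
# invented task names yields the order the scheduler would run them in.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (all names hypothetical)
deps = {
    "extract_orders": set(),
    "load_to_gcs": {"extract_orders"},
    "bq_load": {"load_to_gcs"},
    "dq_checks": {"bq_load"},
    "publish_report": {"dq_checks"},
}
order = list(TopologicalSorter(deps).static_order())
```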

 

Nice to Have

  • GCP Professional Data Engineer Certification
  • Experience with real-time/streaming data pipelines
  • Knowledge of machine learning data pipelines on GCP
  • Experience with Looker or other BI tools
  • Healthcare, Finance, or Retail domain experience
  • Dice Id: 10120137
  • Position Id: 67007-10367-390528