Google Cloud Platform Data Engineer with Google Certifications || Independent Visa

Remote • Posted 1 hour ago • Updated 1 hour ago
Contract W2
Contract Independent
Contract Corp To Corp
No Travel Required
Remote
Depends on Experience

Job Details

Skills

  • GCP
  • GCP Data Engineer
  • Snowflake
  • Google Cloud Platform
  • Migration
  • SQL
  • ETL
  • Healthcare

Summary

We are looking for a highly skilled Data Engineer with strong expertise in Snowflake and Google Cloud Platform (GCP) to design, build, and optimize scalable data platforms and analytics solutions. The role involves developing robust data pipelines, managing cloud data warehouses, and enabling high-performance analytics for business and reporting needs.

Key Responsibilities
Snowflake Data Engineering

Design, develop, and maintain Snowflake data warehouse solutions
Implement and optimize Snowflake objects including databases, schemas, tables, views, and stages
Develop and manage Snowflake SQL, stored procedures, tasks, and streams
Optimize query performance, storage, and compute usage
Implement data sharing, security roles, and access controls in Snowflake
Support data modeling for analytical and reporting use cases

Google Cloud Platform Data Engineering
Design and build end-to-end data pipelines on Google Cloud Platform
Develop ETL/ELT pipelines using BigQuery, Cloud Storage, Dataflow / Dataproc
Integrate data from multiple sources (applications, APIs, files, streaming sources)
Ensure scalability, reliability, and cost optimization of cloud data solutions
Apply best practices for data governance, security, and compliance on Google Cloud Platform
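To illustrate the pipeline work described above, here is a minimal, hypothetical sketch of the kind of per-record transform step an ETL/ELT pipeline on GCP might apply before loading into BigQuery. The function and field names (`transform_record`, `id`, `source`, `amount`) are illustrative assumptions, not part of the posting:

```python
import json

def transform_record(raw: str):
    """Parse one JSON-lines record and normalize it for loading.

    Returns None for records that fail basic validation, mirroring the
    element-level filter step a Dataflow/Beam pipeline would apply.
    """
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return None
    # Require a primary key before loading downstream.
    if not record.get("id"):
        return None
    return {
        "id": str(record["id"]),
        # Normalize free-text fields so the warehouse receives consistent casing.
        "source": record.get("source", "unknown").lower(),
        "amount": float(record.get("amount", 0.0)),
    }

lines = [
    '{"id": 1, "source": "API", "amount": "12.5"}',
    'not json',                 # malformed -> dropped
    '{"source": "file"}',       # missing id -> dropped
]
rows = [r for r in (transform_record(l) for l in lines) if r is not None]
```

In a real Dataflow job this logic would live inside a `ParDo`/`Map` step; keeping it a plain function makes it easy to unit-test outside the pipeline.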

Data Integration & Modeling
Perform data ingestion, transformation, and validation
Design dimensional and analytical data models for reporting and BI
Handle structured and semi-structured data (CSV, JSON, Parquet, etc.)
Ensure data quality checks, reconciliation, and monitoring
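The quality checks and reconciliation mentioned above can be sketched as follows; this is a minimal illustration, and the metric names and columns (`id`, `amount`) are assumptions for the example, not requirements from the posting:

```python
def reconcile(source_rows, target_rows):
    """Compare a source extract against the loaded target and report
    basic data-quality metrics: row-count match, missing keys, null rate."""
    source_ids = {r["id"] for r in source_rows}
    target_ids = {r["id"] for r in target_rows}
    null_amounts = sum(1 for r in target_rows if r.get("amount") is None)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        # Keys extracted from source but never loaded into the target.
        "missing_in_target": sorted(source_ids - target_ids),
        # Share of target rows with a null amount (guard against empty target).
        "null_amount_rate": null_amounts / max(len(target_rows), 1),
    }

source = [{"id": "a"}, {"id": "b"}, {"id": "c"}]
target = [{"id": "a", "amount": 10.0}, {"id": "b", "amount": None}]
report = reconcile(source, target)
```

A report like this would typically feed a monitoring dashboard or fail the pipeline run when thresholds are breached.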

Collaboration & Delivery
Work closely with analytics, reporting, and business teams to understand data requirements
Support UAT, production deployments, and ongoing enhancements
Document data pipelines, models, and technical design
Participate in Agile ceremonies and sprint-based delivery

Required Skills & Experience
Technical Skills

Strong hands-on experience with Snowflake
Strong hands-on experience with Google Cloud Platform (GCP), including BigQuery, Cloud Storage, and Dataflow / Dataproc
Advanced SQL (performance tuning, complex queries)
Experience with ETL / ELT frameworks
Data modeling experience (dimensional / analytical)
Experience with version control tools (Git)

Good to Have
Python (or similar) for data processing and automation
Experience with orchestration tools (e.g., Airflow / Cloud Composer)
Experience working with BI tools (Looker, Tableau, Power BI, Qlik)
Exposure to CI/CD for data pipelines
Healthcare / Financial / Large enterprise data platform experience
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91098872
  • Position Id: 5371-10115-
