Data Architect Google Cloud Platform

Remote • Posted 4 hours ago • Updated 4 hours ago
Contract W2
Remote
$40 - $50/hr
Fitment


Job Details

Skills

  • Access Control
  • Agile
  • Analytical Skills
  • Apache Airflow
  • Cloud Computing
  • Cloud Security
  • Data Lake
  • Data Processing
  • Data Quality
  • Data Security
  • Data Storage
  • Continuous Delivery
  • Continuous Integration
  • Data Architecture
  • Data Flow
  • Python
  • Google Cloud
  • Google Cloud Platform
  • Management
  • Performance Tuning
  • PySpark
  • Clustering
  • Data Governance
  • Data Warehouse
  • GCS
  • Git
  • Query Optimization
  • Real-time
  • SQL
  • Scalability
  • Scrum
  • Storage
  • Workflow

Summary

We are seeking a highly skilled Data Engineer with deep expertise in Google Cloud Platform (GCP) and modern data architecture. The ideal candidate will have hands-on experience designing scalable data pipelines, implementing Medallion Architecture, and building robust enterprise-grade data solutions.

This role requires strong technical proficiency in BigQuery, PySpark, Dataflow, and Airflow, along with a solid understanding of cloud data governance, performance optimization, and CI/CD practices.

Key Responsibilities

  • Design, develop, and maintain scalable batch and real-time data pipelines on Google Cloud Platform
  • Implement and manage Medallion Architecture (Bronze, Silver, Gold layers) for data processing
  • Build high-performance data transformations using Python and PySpark
  • Develop and optimize complex SQL queries for analytical workloads
  • Work extensively with BigQuery for large-scale data processing and performance tuning
  • Develop and deploy pipelines using Cloud Dataflow
  • Orchestrate workflows using Cloud Composer (Apache Airflow)
  • Manage data storage and lifecycle using Google Cloud Storage (GCS)
  • Implement version control and CI/CD pipelines using Git-based tools
  • Ensure data security, governance, and access control using Google Cloud Platform IAM
  • Optimize data solutions for performance, scalability, reliability, and cost-efficiency
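As an illustration of the Medallion layering named above, here is a minimal sketch in plain Python (record and field names such as `order_id` are hypothetical; a production pipeline would implement each layer as PySpark jobs or BigQuery tables, not in-memory dicts):

```python
# Minimal sketch of Medallion layers (Bronze -> Silver -> Gold).
# Field names are hypothetical; this only illustrates the pattern.

def to_silver(bronze_rows):
    """Clean and deduplicate raw Bronze records into the Silver layer."""
    seen = set()
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:   # drop malformed records
            continue
        if row["order_id"] in seen:       # deduplicate on the business key
            continue
        seen.add(row["order_id"])
        silver.append({**row, "amount": float(row["amount"])})  # cast types
    return silver

def to_gold(silver_rows):
    """Aggregate Silver records into a Gold-layer per-customer summary."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "customer": "a", "amount": "10.0"},
    {"order_id": 1, "customer": "a", "amount": "10.0"},    # duplicate
    {"order_id": None, "customer": "b", "amount": "5.0"},  # malformed
    {"order_id": 2, "customer": "a", "amount": "2.5"},
]
print(to_gold(to_silver(bronze)))  # -> {'a': 12.5}
```

The point of the pattern is that each layer is a separate, queryable stage: raw ingestion (Bronze), cleaned and conformed data (Silver), and business-level aggregates (Gold).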

Required Skills & Experience

  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery (partitioning, clustering, query optimization)
  • Proven experience implementing Medallion Data Architecture
  • Strong programming skills in Python and PySpark
  • Advanced proficiency in SQL (complex joins, window functions, performance tuning)
  • Hands-on experience with Cloud Dataflow
  • Experience with Cloud Composer (Airflow) for orchestration
  • Experience working with Google Cloud Storage (GCS)
  • Knowledge of version control systems (Git) and CI/CD practices
  • Strong understanding of Google Cloud Platform IAM and cloud security best practices
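To make the "window functions" requirement concrete, the sketch below mirrors in plain Python what `ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC)` does in SQL (the `dept`/`salary` columns are hypothetical examples, not from this posting):

```python
# Plain-Python equivalent of a per-partition SQL window function:
#   ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC)
# Column names here are illustrative only.
from itertools import groupby
from operator import itemgetter

def row_number_within_partition(rows, partition_key, order_key):
    """Annotate each row with its 1-based rank inside its partition."""
    ranked = []
    rows = sorted(rows, key=itemgetter(partition_key))
    for _, group in groupby(rows, key=itemgetter(partition_key)):
        ordered = sorted(group, key=itemgetter(order_key), reverse=True)
        for i, row in enumerate(ordered, start=1):
            ranked.append({**row, "row_number": i})
    return ranked

rows = [
    {"dept": "eng", "salary": 120},
    {"dept": "eng", "salary": 150},
    {"dept": "ops", "salary": 90},
]
print(row_number_within_partition(rows, "dept", "salary"))
```

In BigQuery the same logic runs as a single SQL expression over a partitioned, clustered table, which is exactly where the partitioning/clustering and query-optimization skills listed above come into play.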

Preferred Qualifications

  • Experience working with large-scale enterprise data platforms
  • Knowledge of data warehousing and data lake concepts
  • Familiarity with real-time streaming frameworks
  • Experience in data governance and data quality frameworks
  • Exposure to Agile/Scrum methodologies

  • Dice Id: 91159203
  • Position Id: 8940208
