Data Engineer – Google Cloud Platform

Remote • Posted 10 hours ago • Updated 10 hours ago
Contract (Independent or W2) • No Travel Required • Compensation: Depends on Experience


Job Details

Skills

  • Google Cloud
  • Data Warehouse
  • Data Architecture
  • Data Flow
  • Cloud Computing

Job Summary

KANINI is seeking a highly skilled Data Engineer with deep expertise in Google Cloud Platform (GCP) and modern data architecture. The ideal candidate will have hands-on experience designing scalable data pipelines, implementing Medallion Architecture, and building robust enterprise-grade data solutions.

This role requires strong technical proficiency in BigQuery, PySpark, Dataflow, and Airflow, along with a solid understanding of cloud data governance, performance optimization, and CI/CD practices.


Key Responsibilities

  • Design, develop, and maintain scalable batch and real-time data pipelines on Google Cloud Platform

  • Implement and manage Medallion Architecture (Bronze, Silver, Gold layers) for data processing

  • Build high-performance data transformations using Python and PySpark

  • Develop and optimize complex SQL queries for analytical workloads

  • Work extensively with BigQuery for large-scale data processing and performance tuning

  • Develop and deploy pipelines using Cloud Dataflow

  • Orchestrate workflows using Cloud Composer (Apache Airflow)

  • Manage data storage and lifecycle using Google Cloud Storage (GCS)

  • Implement version control and CI/CD pipelines using Git-based tools

  • Ensure data security, governance, and access control using Google Cloud Platform IAM

  • Optimize data solutions for performance, scalability, reliability, and cost-efficiency
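
The Medallion responsibilities above can be illustrated with a minimal sketch. This is a pure-Python toy, not the team's actual implementation: in the role, the Bronze/Silver/Gold layers would be PySpark or Dataflow jobs over GCS and BigQuery, and the record fields below are hypothetical.

```python
# Minimal pure-Python sketch of a Medallion (Bronze/Silver/Gold) flow.
# In practice these layers would be PySpark/Dataflow jobs over GCS and
# BigQuery; the order records below are invented for illustration.

bronze = [  # Bronze: raw events ingested as-is (duplicates and bad rows kept)
    {"order_id": "1", "amount": "10.50", "region": "us"},
    {"order_id": "1", "amount": "10.50", "region": "us"},   # duplicate
    {"order_id": "2", "amount": "bad",   "region": "eu"},   # malformed
    {"order_id": "3", "amount": "7.25",  "region": "eu"},
]

def to_silver(rows):
    """Silver: validate types and deduplicate on order_id."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine rows with malformed amounts
        if r["order_id"] in seen:
            continue  # drop duplicate orders
        seen.add(r["order_id"])
        out.append({**r, "amount": amount})
    return out

def to_gold(rows):
    """Gold: business-level aggregate, revenue per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'us': 10.5, 'eu': 7.25}
```

The pattern is the point: raw data lands untouched in Bronze, Silver applies cleansing and deduplication, and Gold serves curated aggregates to consumers.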


Required Skills & Experience

  • Strong hands-on experience with Google Cloud Platform (GCP)

  • Expertise in BigQuery (partitioning, clustering, query optimization)

  • Proven experience implementing Medallion Data Architecture

  • Strong programming skills in Python and PySpark

  • Advanced proficiency in SQL (complex joins, window functions, performance tuning)

  • Hands-on experience with Cloud Dataflow

  • Experience with Cloud Composer (Airflow) for orchestration

  • Experience working with Google Cloud Storage (GCS)

  • Knowledge of version control systems (Git) and CI/CD practices

  • Strong understanding of Google Cloud Platform IAM and cloud security best practices
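
As a concrete sense of the SQL proficiency listed above, here is a hedged sketch of a window-function query. It runs against SQLite for portability; BigQuery Standard SQL accepts the same `SUM() OVER (PARTITION BY ... ORDER BY ...)` pattern. The `sales` table and its columns are invented for illustration.

```python
# Window-function sketch: running total per region, ordered by day.
# SQLite is used here so the example is self-contained; the same query
# shape works in BigQuery Standard SQL. Table/columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("us", 1, 100.0), ("us", 2, 50.0), ("eu", 1, 80.0), ("eu", 2, 20.0)],
)

rows = conn.execute(
    """
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()
for r in rows:
    print(r)
```

In BigQuery, the same query would typically run against a table partitioned on the date column and clustered on `region` to control scan cost.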


Preferred Qualifications

  • Experience working with large-scale enterprise data platforms

  • Knowledge of data warehousing and data lake concepts

  • Familiarity with real-time streaming frameworks

  • Experience in data governance and data quality frameworks

  • Exposure to Agile/Scrum methodologies

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10125806
  • Position Id: 8933846
