Senior Google Cloud Platform Data Engineer

Overview

On Site
Depends on Experience
Full Time

Skills

Google Cloud Platform
Python
Scala
RDBMS
Apache Hadoop
Apache Hive
Big Data
Data Modeling
Extract, Transform, Load (ETL)
GCS
Shell
Apache Spark

Job Details

Key Responsibilities:

Design and build scalable big data applications using open-source technologies such as Spark, Hive, and Kafka

Develop data pipelines and orchestrate workflows using Apache Airflow

Implement and optimize ETL/ELT pipelines in Google Cloud Platform (Dataproc, GCS, BigQuery)

Model and design schemas for data lakes and RDBMS platforms

Automate data workflows and manage multi-TB/PB-scale datasets

Provide ongoing support, maintenance, and participate in on-call rotations

Collaborate with cross-functional teams to deliver clean, reliable data products
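
As a rough illustration of the ETL/ELT pipeline work described above, here is a minimal, library-free Python sketch of an extract-transform-load flow. The record fields, validation rule, and in-memory sink are hypothetical; in this role the same stages would typically run on Spark via Dataproc, reading from GCS and loading into BigQuery.

```python
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount: float

def extract(raw_rows):
    # Parse raw (user_id, amount) tuples into typed records (hypothetical format).
    return [Event(user_id=r[0], amount=float(r[1])) for r in raw_rows]

def transform(events):
    # Drop invalid records (negative amounts) and aggregate spend per user.
    totals = {}
    for e in events:
        if e.amount >= 0:
            totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

def load(totals, sink):
    # In production this step would write to BigQuery; here we append to a list.
    for user_id, total in sorted(totals.items()):
        sink.append((user_id, total))
    return sink

raw = [("u1", "10.0"), ("u2", "-5.0"), ("u1", "2.5")]
result = load(transform(extract(raw)), [])
print(result)  # [('u1', 12.5)]
```

The same extract/transform/load separation carries over directly to a Spark job, where each stage becomes a DataFrame operation orchestrated as an Airflow task.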

Required Skills:

5+ years of experience with Hadoop, Spark, Hive, Airflow, or equivalent big data tools

Proficiency in Scala, Python, and scripting languages such as Shell or Perl

Strong experience with data modeling (logical and physical)

Hands-on Google Cloud Platform experience (Dataproc, GCS, BigQuery)

Knowledge of distributed systems, test-driven development, and automated testing frameworks
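
To illustrate the logical/physical data-modeling skill listed above, here is a minimal star-schema sketch in Python: one dimension table and one fact table joined by a surrogate key. All table and column names are hypothetical; in practice these would be BigQuery tables and the aggregation a SQL GROUP BY.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimUser:
    # Dimension table row (hypothetical): surrogate key plus attributes.
    user_key: int
    country: str

@dataclass(frozen=True)
class FactOrder:
    # Fact table row: measures plus a foreign key into DimUser.
    order_id: int
    user_key: int
    amount: float

users = {1: DimUser(1, "US"), 2: DimUser(2, "DE")}
orders = [FactOrder(100, 1, 9.75), FactOrder(101, 2, 4.5), FactOrder(102, 1, 1.25)]

# A typical analytic query over this model: revenue by country
# (equivalent to a JOIN + GROUP BY in BigQuery).
revenue = {}
for o in orders:
    country = users[o.user_key].country
    revenue[country] = revenue.get(country, 0.0) + o.amount
print(revenue)  # {'US': 11.0, 'DE': 4.5}
```

Keeping descriptive attributes in dimensions and measures in facts is what lets the physical schema stay narrow and the analytic queries stay simple at multi-TB scale.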

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.