Google Cloud Platform Data Engineer

Hybrid in Irving, TX, US • Posted 20 days ago • Updated 20 days ago
Full Time
No Travel Required
On-site
Depends on Experience


Job Details

Skills

  • Spark
  • GCP
  • BigQuery
  • Migration
  • ETL
  • Data Modeling
  • PySpark
  • Data Engineering
  • SQL
  • Python
  • Airflow

Summary

Job Description: We are hiring a Senior Data Engineer with proven cloud migration experience and hands-on expertise in Big Data tools.

Key Responsibilities:

  • Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
  • Build and maintain data pipelines for reporting and downstream applications using open-source frameworks and cloud technologies.
  • Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
  • Optimize data structures for performance and scalability across large datasets.
  • Collaborate with architects and engineering teams to ensure alignment with target state architecture.
  • Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
  • Develop, schedule, and orchestrate complex workflows using Apache Airflow, with strong proficiency in designing and managing Airflow DAGs.
  • Troubleshoot and resolve issues in data pipelines and ensure high availability and reliability.
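To give a flavor of the batch pipeline work described above, here is a minimal extract-transform-load sketch in plain Python. All names and fields are hypothetical; a production pipeline at this scale would use PySpark or Dataflow rather than in-memory lists.

```python
# Hypothetical batch ETL sketch: extract raw records, drop malformed
# rows, normalize fields, and load the result into a target store.

def extract(raw_rows):
    """Simulate reading raw records from a source system."""
    return list(raw_rows)

def transform(rows):
    """Drop rows missing required fields and normalize the rest."""
    cleaned = []
    for row in rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue  # skip malformed records
        cleaned.append({
            "user_id": str(row["user_id"]).strip(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, sink):
    """Append transformed rows to a target store (here, a list)."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"user_id": " u1 ", "amount": "10.50"},
    {"user_id": None, "amount": "5.00"},  # malformed: no user_id
    {"user_id": "u2", "amount": 7},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # → 2
```

The same extract/transform/load separation carries over directly to PySpark, where each stage becomes a DataFrame transformation scheduled by an Airflow DAG.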

Required Technical Skills

  • Strong Understanding of Data: Data structures, modeling, and lifecycle management.
  • ETL/ELT Expertise: Hands-on experience designing and managing data pipelines.
  • PySpark: Advanced skills in distributed data processing and transformation.
  • Apache Iceberg: Experience implementing open table formats for analytics.
  • Hadoop Ecosystem: Knowledge of HDFS, Hive, and related components.
  • Cloud Platforms: Google Cloud Platform (BigQuery, Dataflow), Delta Lake, and Dataplex for governance and metadata management.
  • Programming: Python, Spark, SQL.
  • Workflow Orchestration: Strong experience with Apache Airflow, including authoring and maintaining DAGs for complex workflows.
  • Database & Reporting Concepts: Strong understanding of relational and distributed systems.
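As an illustration of the SQL and relational skills listed above, the snippet below shows a common warehouse pattern: deduplicating events with a window function, keeping only the latest row per key. It uses the stdlib sqlite3 module as a stand-in for a warehouse such as BigQuery; the table, columns, and data are hypothetical, but the ROW_NUMBER() pattern is the same in standard SQL.

```python
# Hypothetical dedup example: keep the most recent event per user.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", 10.0, 1), ("u1", 12.5, 3), ("u2", 7.0, 2)],
)

# ROW_NUMBER() partitions by key and orders newest-first,
# so rn = 1 selects the latest row for each user_id.
latest = conn.execute("""
    SELECT user_id, amount FROM (
        SELECT user_id, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY user_id ORDER BY ts DESC
               ) AS rn
        FROM events
    )
    WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(latest)  # → [('u1', 12.5), ('u2', 7.0)]
```

Note that window functions require SQLite 3.25+; in BigQuery the identical query works with `QUALIFY rn = 1` as a more concise alternative to the subquery.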
  • Dice Id: 10114908
  • Position Id: 8884034
