Google Cloud Platform Data Engineer

  • Dallas, TX
  • Posted 12 hours ago | Updated 12 hours ago

Overview

On Site
$110,000 - $130,000
Full Time
Accepts corp to corp applications

Skills

PySpark
Python
DataCore
Compute Engine
Dataproc
Kubernetes Engine
Cloud Storage
BigQuery
Spark SQL
DataFrames
pytest
GitHub
SQL Queries
GCP

Job Details

Title: Google Cloud Platform Data Engineer

Location: Richardson, Dallas, TX (Day 1 onsite)

Full Time

Job Description

Requires 12+ years of experience with PySpark, Python, proactive monitoring, alerting mechanisms, and DataCore.

  • Experience with Google Cloud Platform services such as Compute Engine, Dataproc, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, and Dataflow.
  • Experience with Cloud Composer and ETL: working with large data sets using PySpark, Python, Spark SQL, DataFrames, and pytest.
  • Develop and implement proactive monitoring and alerting mechanisms for data issues.
  • Familiarity with CI/CD pipelines and automation tools such as Jenkins, GitHub, and GitHub Actions.
  • Able to write complex SQL queries to compute business results.
  • Develop architecture recommendations based on Google Cloud Platform best practices and industry standards.
  • Work through all stages of a data solution life cycle: analyze/profile data; create conceptual, logical, and physical data model designs; and architect and design ETL, reporting, and analytics solutions.
  • Conduct technical reviews and ensure that Google Cloud Platform solutions meet functional and non-functional requirements.
  • Strong knowledge of Google Cloud Platform architecture and design patterns.
  • Business Logic & Workload Processing: Data Engineer Responsibilities
  • Developing Workloads for Business Logic Execution
    Designed and implemented scalable workloads on Google Cloud Platform to process complex business rules for Rx claim pricing and drug coverage analysis.
  • Business Rule Integration
    Collaborated with stakeholders to translate regulatory and business requirements into executable rules. These rules determine drug pricing based on plan configurations, drug coverage, and pharmacy-specific factors.
  • Workload Orchestration
    Leveraged Cloud Composer, Dataflow, and BigQuery to build automated, efficient workflows that:
    • Ingest and validate data
    • Apply rule-based logic at scale
    • Output regulatory-compliant pricing files
  • Dynamic Rule Processing
    Supported rule versioning and updates to ensure accurate real-time processing across large volumes of data.
  • Optimization & Monitoring
    Tuned workload performance for cost efficiency and monitored execution through Google Cloud Platform's built-in tools, ensuring timely delivery and data quality.
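The dynamic rule processing described above (versioned business rules applied to claims at pricing time) can be sketched in plain Python. This is a minimal illustrative example only: the rule fields (`drug_tier`, `multiplier`), the claim shape, and the pricing logic are assumptions for demonstration, not details from this role; a production version would run as a PySpark or Dataflow job.

```python
from dataclasses import dataclass

@dataclass
class PricingRule:
    """One versioned business rule: a match field plus a price adjustment.

    Fields are hypothetical; real rules would cover plan configuration,
    drug coverage, and pharmacy-specific factors.
    """
    version: int
    drug_tier: str       # which claims this rule applies to
    multiplier: float    # applied to the claim's base price

def price_claim(claim: dict, rules: list) -> float:
    """Apply the latest matching rule version to a claim's base price."""
    matching = [r for r in rules if r.drug_tier == claim["drug_tier"]]
    if not matching:
        return claim["base_price"]  # no rule matches: pass price through
    latest = max(matching, key=lambda r: r.version)  # rule versioning
    return round(claim["base_price"] * latest.multiplier, 2)

rules = [
    PricingRule(version=1, drug_tier="generic", multiplier=0.80),
    PricingRule(version=2, drug_tier="generic", multiplier=0.75),  # supersedes v1
    PricingRule(version=1, drug_tier="brand", multiplier=1.10),
]

claim = {"claim_id": "RX-1001", "drug_tier": "generic", "base_price": 40.00}
print(price_claim(claim, rules))  # v2 generic rule applies: 30.0
```

Selecting the highest `version` among matching rules is what lets updated rules take effect without removing historical ones, which is the essence of the rule-versioning responsibility above.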

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About EdHike, LLC