Overview
Location: Remote
Compensation: Up to $120,000
Employment Type: Full Time
Skills: GCP

Job Details
Job Description:
Senior Data Platform Engineer (Google Cloud Platform Focus)
Key Responsibilities
Data Pipeline Automation & Engineering
- Design, build, and automate scalable, production-grade data pipelines on Google Cloud Platform using core services such as Cloud Composer (Airflow), Dataflow (Apache Beam), and BigQuery.
- Develop and implement continuous integration and continuous deployment (CI/CD) workflows for data processing and analytics pipelines using tools like Cloud Build, GitHub Actions, or Jenkins.
- Implement reusable data frameworks, templates, and libraries to standardize pipeline deployments and configuration management and to promote "Infrastructure as Code" principles.
- Orchestrate complex, multi-stage ETL/ELT pipelines across diverse data sources and environments, ensuring efficient resource utilization and low latency.
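To illustrate the "reusable frameworks and templates" theme above, here is a minimal, hypothetical sketch of a config-driven pipeline template in plain Python. The `PipelineTemplate` class and the stage names are illustrative only; in practice this role would express the same pattern as Cloud Composer (Airflow) DAGs or Dataflow (Apache Beam) templates.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch: a reusable pipeline template that registers
# extract/transform stages in declared order and runs them sequentially.
@dataclass
class PipelineTemplate:
    name: str
    stages: list = field(default_factory=list)

    def stage(self, fn: Callable) -> Callable:
        # Register a pipeline step; decoration order defines execution order.
        self.stages.append(fn)
        return fn

    def run(self, record: dict) -> dict:
        # Thread the record through each registered stage.
        for fn in self.stages:
            record = fn(record)
        return record

orders = PipelineTemplate(name="orders_daily")

@orders.stage
def extract(record: dict) -> dict:
    # In production this would read from Pub/Sub or Cloud Storage.
    return {**record, "source": "orders_api"}

@orders.stage
def transform(record: dict) -> dict:
    # Normalize currency units before loading into BigQuery.
    return {**record, "amount_usd": record["amount_cents"] / 100}

result = orders.run({"amount_cents": 1999})
```

The design choice sketched here, declaring stages against a named template rather than wiring them ad hoc, is what makes pipeline definitions repeatable across environments.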
Data Quality, Testing & Reliability
- Implement and manage automated data validation, schema checks, and anomaly detection using industry-leading tools like Great Expectations, dbt tests, or custom Python frameworks.
- Integrate quality gates directly into CI/CD workflows to catch issues early and continuously improve data reliability, following Data Reliability Engineering (DRE) principles.
- Schedule, monitor, and optimize data workflows, ensuring strict adherence to data delivery SLAs.
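A "custom Python framework" for validation, as mentioned above, can be as simple as schema checks plus a statistical anomaly flag. The sketch below is hypothetical; the column names, types, and z-score threshold are assumptions, and a tool like Great Expectations would express the same checks as declarative expectation suites.

```python
import statistics

# Hypothetical expected schema for an orders table (illustrative names).
EXPECTED_SCHEMA = {"order_id": int, "amount_usd": float}

def validate_schema(row: dict) -> list[str]:
    # Return a list of human-readable schema violations for one row.
    errors = []
    for column, expected_type in EXPECTED_SCHEMA.items():
        if column not in row:
            errors.append(f"missing column: {column}")
        elif not isinstance(row[column], expected_type):
            errors.append(f"bad type for {column}: {type(row[column]).__name__}")
    return errors

def flag_anomalies(values: list[float], z_threshold: float = 2.0) -> list[float]:
    # Flag values more than z_threshold sample standard deviations from the mean.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if stdev and abs(v - mean) / stdev > z_threshold]

good_row = {"order_id": 1, "amount_usd": 19.99}
bad_row = {"order_id": "1", "amount_usd": 19.99}
```

Wired into a CI/CD quality gate, a nonempty error list or anomaly list would fail the pipeline run before bad data reaches downstream consumers.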
Monitoring & Observability
- Set up and maintain proactive monitoring, logging, and automated alerting for all data pipelines and platform components using Cloud Monitoring and Cloud Logging (formerly Stackdriver), Prometheus, or Grafana.
- Develop and maintain comprehensive dashboards to track critical metrics, including data health, SLA adherence, and pipeline operational performance.
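The dashboard metrics described above reduce to a few aggregate numbers computed over pipeline run records. This is a hypothetical sketch, with made-up field names and an assumed 6-hour SLA, of the kind of success-rate and SLA-adherence figures a Grafana or Cloud Monitoring dashboard would chart.

```python
from datetime import datetime, timedelta

# Assumed delivery SLA for illustration: runs must finish within 6 hours.
SLA = timedelta(hours=6)

# Hypothetical pipeline run records (in practice, pulled from run metadata).
runs = [
    {"pipeline": "orders_daily", "started": datetime(2024, 1, 1, 0, 0),
     "finished": datetime(2024, 1, 1, 4, 30), "ok": True},
    {"pipeline": "orders_daily", "started": datetime(2024, 1, 2, 0, 0),
     "finished": datetime(2024, 1, 2, 7, 15), "ok": True},
    {"pipeline": "orders_daily", "started": datetime(2024, 1, 3, 0, 0),
     "finished": datetime(2024, 1, 3, 3, 45), "ok": False},
]

def sla_metrics(run_records: list) -> dict:
    # Success rate: fraction of runs that completed without error.
    # SLA adherence: fraction that both succeeded and finished within SLA.
    total = len(run_records)
    successes = sum(1 for r in run_records if r["ok"])
    within_sla = sum(1 for r in run_records
                     if r["ok"] and r["finished"] - r["started"] <= SLA)
    return {
        "success_rate": successes / total,
        "sla_adherence": within_sla / total,
    }

metrics = sla_metrics(runs)
```

Tracking SLA adherence separately from raw success rate surfaces runs that technically succeeded but delivered data too late to be useful.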
Data Governance & Metadata Management
- Integrate and manage data assets, schemas, and metadata within Google Data Catalog or equivalent metadata management platforms.
- Enforce robust governance policies, including data lineage tracking, strict access control (IAM), and compliance standards for sensitive data.
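Lineage tracking, mentioned above, is at heart a directed graph of dataset dependencies that can answer "what is upstream of X?" queries. The sketch below uses invented dataset names; in this role, such edges would typically live as metadata in Data Catalog or an equivalent platform rather than in a Python dict.

```python
# Hypothetical table-level lineage graph: each dataset maps to the
# datasets it is derived from (names are illustrative only).
lineage = {
    "bq.analytics.orders_summary": ["bq.staging.orders_clean"],
    "bq.staging.orders_clean": ["gcs.raw.orders_export"],
    "gcs.raw.orders_export": [],
}

def upstream(dataset: str, graph: dict) -> set:
    # Depth-first walk collecting every transitive ancestor of `dataset`.
    seen = set()
    stack = list(graph.get(dataset, []))
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.add(parent)
            stack.extend(graph.get(parent, []))
    return seen
```

A query like this is what lets governance tooling scope an access-control review or impact analysis to exactly the sources a sensitive table depends on.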
Required Skills & Experience
- 5+ years of professional experience in data engineering, data platform operations, or a similar cloud-native technical role.
- Strong expertise in the Google Cloud Platform (GCP) data stack: BigQuery, Dataflow, Cloud Composer, Pub/Sub, Cloud Functions, and Cloud Build.
- High proficiency in Python, SQL, and general automation scripting.
- Hands-on experience with CI/CD principles and tools, including GitOps and Infrastructure as Code (IaC) using Terraform or Cloud Deployment Manager.
- Proven experience with data quality and testing frameworks such as Great Expectations, dbt, or PyTest.
- Working knowledge of observability, logging, and monitoring frameworks for high-volume data systems.
- Familiarity with metadata management, data lineage tools, and establishing data governance policies.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.