Job Details
Position: Senior Google Cloud Platform Data Engineer
Location: New York - ONSITE
We are seeking an experienced Senior Data Engineer with deep expertise in Google Cloud Platform (GCP) to join our Data Engineering team. In this role, you will design, build, and optimize scalable data pipelines, data lakes, and data warehouses on GCP to support advanced analytics, machine learning, and business intelligence initiatives across the organization.
This is a high-impact role for someone who thrives on designing end-to-end data architectures, optimizing performance, and enabling data-driven decision-making at scale.
Key Responsibilities
Data Engineering & Architecture
Design and implement scalable, secure data pipelines using GCP-native services (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Composer).
Develop ETL/ELT processes to ingest data from diverse sources including APIs, flat files, databases, and streaming sources.
Build and maintain data lakes and data warehouses on GCP, ensuring schema optimization, partitioning, and cost-efficiency.
Create and manage batch and real-time data workflows using Cloud Composer (Apache Airflow) and Dataflow (Apache Beam).
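To illustrate the kind of ETL/ELT work described above, here is a minimal, self-contained Python sketch of an extract-transform-load step. The record shape, field names, and in-memory sink are hypothetical stand-ins for real sources (APIs, flat files, Pub/Sub) and for a warehouse load job; this is a sketch of the pattern, not a production pipeline.

```python
"""Minimal ETL sketch: ingest records, transform them, load to a sink.

Illustrative only -- the record fields ("user_id", "amount") and the
list-based sink are hypothetical, not a specific production schema.
"""

def extract(raw_rows):
    # In production this might read from an API, flat file, or Pub/Sub topic.
    return [dict(r) for r in raw_rows]

def transform(rows):
    # Normalize field names and drop rows missing a required key.
    out = []
    for row in rows:
        if row.get("user_id") is None:
            continue  # skip incomplete records
        out.append({
            "user_id": row["user_id"],
            "amount_usd": round(float(row.get("amount", 0)), 2),
        })
    return out

def load(rows, sink):
    # Stand-in for a BigQuery load job or streaming insert.
    sink.extend(rows)
    return len(rows)

if __name__ == "__main__":
    raw = [{"user_id": 1, "amount": "19.991"}, {"amount": "5.00"}]
    warehouse = []
    loaded = load(transform(extract(raw)), warehouse)
    print(loaded, warehouse)
```

In a real GCP deployment, each stage would typically map to a Dataflow (Apache Beam) transform or a Cloud Composer task rather than plain functions.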
Data Management & Governance
Ensure data quality, integrity, and reliability through robust validation and monitoring frameworks.
Implement best practices for data lineage, metadata management, and data cataloging using tools like Data Catalog and Looker.
Collaborate with data governance, security, and compliance teams to enforce data access controls and encryption.
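The data-quality responsibilities above can be sketched as a simple rule check over a batch of rows. The function name, fields, and threshold below are hypothetical; real frameworks (e.g., validation steps wired into a pipeline) would be richer, but the shape is the same: compute per-field metrics, compare against a tolerance, and emit a pass/fail report.

```python
"""Toy data-quality check: per-field null counts with a pass/fail flag.

The field names and the zero-tolerance default are illustrative, not a
specific framework's API.
"""

def check_batch(rows, required_fields, max_null_ratio=0.0):
    """Report null counts for each required field and whether the batch passes."""
    report = {"row_count": len(rows), "null_counts": {}, "passed": True}
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        report["null_counts"][field] = nulls
        ratio = nulls / len(rows) if rows else 0.0
        if ratio > max_null_ratio:
            report["passed"] = False  # too many missing values for this field
    return report
```

A monitoring framework would run checks like this after each load and alert (or halt downstream tasks) when a batch fails.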
Collaboration & Leadership
Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
Provide technical mentorship and code reviews for junior data engineers.
Contribute to architecture reviews and technology evaluations for continuous improvement of the data platform.
Minimum Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
5+ years of experience in data engineering, including at least two years working extensively on GCP.
Strong proficiency with SQL, Python, and/or Java/Scala for data processing and scripting.
Proven experience with GCP services such as:
BigQuery
Dataflow / Apache Beam
Pub/Sub
Cloud Storage
Cloud Composer (Airflow)
Cloud Functions
Experience building large-scale ETL/ELT pipelines and optimizing query performance on cloud platforms.
Solid understanding of data modeling, partitioning, and schema design for analytical workloads.
Familiarity with CI/CD practices, Terraform, and version control (e.g., Git).
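The schema-design and partitioning skills listed above often come down to statements like the one generated below: a BigQuery date-partitioned, clustered table. The helper function, table name, and columns are hypothetical illustrations; `PARTITION BY DATE(...)` and `CLUSTER BY` are standard BigQuery DDL clauses.

```python
"""Sketch: build BigQuery-style DDL for a date-partitioned, clustered table.

The helper and the example table/columns are hypothetical; the DDL clauses
themselves (PARTITION BY DATE, CLUSTER BY) are standard BigQuery syntax.
"""

def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return CREATE TABLE DDL partitioned by a date column and clustered."""
    cols = ",\n  ".join(f"{name} {col_type}" for name, col_type in columns)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

if __name__ == "__main__":
    ddl = partitioned_table_ddl(
        "analytics.events",
        [("event_ts", "TIMESTAMP"), ("user_id", "STRING"), ("amount", "NUMERIC")],
        "event_ts",
        ["user_id"],
    )
    print(ddl)
```

Partitioning on an event timestamp and clustering on a frequently filtered column is a common way to cut both scan costs and query latency for analytical workloads.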
Preferred Qualifications
Google Cloud Professional Data Engineer certification or equivalent.
Experience with Looker, dbt (data build tool), or similar data modeling tools.
Prior work with streaming data architectures and technologies such as Kafka, Spark Streaming, or Flink.
Experience in data privacy, GDPR/CCPA compliance, and implementing data security at rest and in transit.
Exposure to DevOps and container orchestration tools (e.g., Kubernetes, Cloud Run).
Strong analytical and problem-solving skills with attention to performance and scalability.
Soft Skills
Excellent communication and stakeholder engagement skills.
Proven ability to work independently and lead complex projects end-to-end.
Collaborative mindset with a drive to mentor, document, and share knowledge.