Lead Google Cloud Platform Data Engineer

Overview

On Site
$65 - $67
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

GCP
BigQuery
ETL
Dataflow

Job Details

Location: Iselin, NJ or Dallas, TX (Onsite)

A lead data engineer on Google Cloud Platform (GCP) for financial services should design, enable, and secure a modern data lake that supports advanced AI/ML workloads and regulatory reporting. This role requires expertise in scalable architecture, deep data governance, and the ability to operationalize data for analytics and compliance.

Key Responsibilities:

  • Architect and build high-performance, compliant data pipeline frameworks that support batch, streaming, and real-time use cases on a well-governed financial data lake using Google Cloud Platform services such as Cloud Storage, BigQuery, Dataplex, and Data Catalog.
  • Define and maintain data modeling and metadata standards to ensure traceability and auditability.
  • Implement data lineage, cataloging, and observability frameworks aligned with financial data governance mandates.
  • Build robust ETL/ELT pipelines in BigQuery, Dataflow, and Dataproc, with orchestration through Airflow or Cloud Composer.
  • Enable real-time analytics and event-driven data integration using Pub/Sub, Kafka, or similar technologies.
  • Implement data quality validation and anomaly detection frameworks.
  • Develop and optimize BigQuery schemas, partitioning, and clustering strategies to manage large-scale financial data lakes and warehouses.
  • Implement automated data lineage and auditability for regulatory transparency with services like Dataplex and Data Fusion Lineage.
  • Operationalize AI/ML by supporting real-time and batch data pipelines, leveraging BigQuery ML, Dataflow, and Vertex AI for advanced analytics, risk scoring, and fraud detection.
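As a rough sketch of the BigQuery partitioning and clustering strategies the responsibilities above describe, the DDL for a date-partitioned, clustered table might look like the following (the project, dataset, table, and column names are hypothetical illustrations, not taken from the posting):

```python
def trades_table_ddl(project: str, dataset: str, table: str) -> str:
    """Return illustrative BigQuery DDL for a financial trades table,
    partitioned by trade date and clustered on commonly filtered columns."""
    return f"""
CREATE TABLE IF NOT EXISTS `{project}.{dataset}.{table}` (
  trade_id STRING NOT NULL,
  account_id STRING NOT NULL,
  instrument STRING,
  trade_ts TIMESTAMP NOT NULL,
  notional NUMERIC
)
PARTITION BY DATE(trade_ts)          -- prune scans to the partitions queried
CLUSTER BY account_id, instrument    -- co-locate rows for common filter columns
OPTIONS (partition_expiration_days = 2555)  -- retention window, illustrative only
""".strip()

print(trades_table_ddl("my-project", "finance", "trades"))
```

Partitioning on the event timestamp limits scan costs for date-bounded regulatory queries, while clustering keys should match the columns most often used in filters and joins.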

Qualifications

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 8+ years of data engineering experience (at least 2 years in financial services or regulated environments).
  • Expert knowledge of BigQuery, Cloud Storage, Dataflow, and Pub/Sub on Google Cloud Platform.
  • Strong experience with SQL, Spark, Python, PySpark and data orchestration tools (Airflow, Cloud Composer, Dagster).
  • Demonstrated success designing modular, reusable data frameworks and libraries.
  • Deep understanding of data lake and warehousing concepts, ELT/ETL patterns, and financial data modeling.
  • Familiarity with security standards and compliance frameworks (e.g., PCI DSS, GDPR, SOC 2, ISO 27001).
  • Experience with real-time financial data streams (trading, payments, or fraud detection).
  • Familiarity with risk and regulatory reporting systems (Basel II/III, MiFID II, CCAR, IFRS).
  • Exposure to data science enablement, MLOps/DataOps; for example, building feature pipelines for credit risk, fraud, or customer analytics.
  • Experience with dbt, Looker, or Looker Studio for semantic modeling and reporting.

About Rivago infotech inc