Data Architect

Overview

Hybrid
Depends on Experience
Full Time

Skills

BigQuery
GCP
Data Architect

Job Details

Data Architect with BigQuery: Summary
A skilled Data Architect with strong expertise in designing, building, and optimizing large-scale data platforms using Google BigQuery and the broader Google Cloud Platform (GCP) ecosystem. Experienced in developing end-to-end data architectures, data pipelines, and analytical solutions that support enterprise reporting, advanced analytics, and real-time data processing.
Responsibilities
  • Architect and implement data warehouse and data lake solutions using BigQuery.
  • Design scalable ETL/ELT pipelines using tools like Dataflow, Dataproc, Cloud Composer, Pub/Sub, and Cloud Storage.
  • Define data modeling frameworks (Star, Snowflake, Data Vault) for analytical workloads.
  • Optimize BigQuery performance including partitioning, clustering, materialized views, and cost-efficient query patterns.
  • Establish data governance, metadata management, data security, and compliance frameworks.
  • Collaborate with data engineering, analytics, and business teams to translate requirements into scalable data architectures.
  • Lead cloud migration from on-prem systems to BigQuery-based architectures.
  • Implement CI/CD for data pipelines and modern DevOps/DataOps practices.
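As an illustration of the partitioning and clustering work described above, a minimal BigQuery DDL sketch (dataset, table, and column names are hypothetical):

```sql
-- Hypothetical sales table: partitioned by event date and clustered by
-- region and customer, so queries that filter on these columns scan
-- (and are billed for) less data.
CREATE TABLE my_dataset.sales (
  event_ts    TIMESTAMP,
  region      STRING,
  customer_id STRING,
  amount      NUMERIC
)
PARTITION BY DATE(event_ts)
CLUSTER BY region, customer_id;

-- Materialized view precomputing a common daily aggregate,
-- refreshed automatically by BigQuery.
CREATE MATERIALIZED VIEW my_dataset.daily_sales AS
SELECT DATE(event_ts) AS day, region, SUM(amount) AS total
FROM my_dataset.sales
GROUP BY day, region;
```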
Skills
  • Google Cloud Platform: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, Dataform
  • Programming: SQL, Python, Java (optional)
  • Tools: Airflow, dbt, Terraform, Looker, Tableau, Power BI
  • Concepts: Data Modeling, Data Quality, Data Governance, MDM, Streaming Data, APIs
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.