Snowflake Data Architect Job Description (Kafka + Google Dataflow + Tableau)
Overview
Client is seeking a highly skilled Snowflake Data Architect to design, modernize, and optimize a cloud-based enterprise data platform that supports both batch and real-time analytics. The architect will design end-to-end Snowflake solutions, integrate Kafka streaming pipelines, leverage Google Dataflow for scalable data processing, and support enterprise reporting through Tableau.
Key Responsibilities
Snowflake Architecture & Data Platform Design
- Architect, design, and optimize Snowflake data warehouses, data lakes, and data marts.
- Define Snowflake best practices for compute sizing, micro-partitioning, clustering, RBAC, security, and cost governance.
- Implement scalable data models (dimensional, 3NF, Data Vault) for enterprise analytics.
Streaming & Real-Time Data Pipelines (Kafka)
- Design and implement real-time ingestion pipelines from Kafka, Confluent Cloud, Amazon MSK, or Azure Event Hubs into Snowflake.
- Use Snowpipe, Snowpipe Streaming, Kafka Connect, or custom ingestion frameworks to deliver low-latency data (see the connector sketch after this list).
- Ensure schema evolution, replayability, and fault tolerance for streaming workloads.
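As an illustration of the Kafka-to-Snowflake pattern above, the sketch below registers the Snowflake sink connector with a Kafka Connect cluster through its REST API. This is a minimal sketch, not a production configuration: the endpoint, credentials, topic, and table names are all hypothetical placeholders.

```python
# Minimal sketch: register a Snowflake sink connector with Kafka Connect.
# Assumes a Connect REST endpoint at localhost:8083; every name and
# credential below is a placeholder.
import requests

connector = {
    "name": "snowflake-sink-events",
    "config": {
        "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
        "topics": "events",
        "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
        "snowflake.user.name": "INGEST_SVC",
        "snowflake.private.key": "<private-key-body>",
        "snowflake.database.name": "RAW",
        "snowflake.schema.name": "KAFKA",
        "snowflake.role.name": "INGEST_ROLE",
        "snowflake.topic2table.map": "events:EVENTS_RAW",
        # Snowpipe Streaming trades file batching for row-level, low-latency inserts.
        "snowflake.ingestion.method": "SNOWPIPE_STREAMING",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector, timeout=30)
resp.raise_for_status()
print(resp.json())
```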
Google Dataflow (Apache Beam)
- Build and orchestrate Dataflow pipelines for real-time and batch transformations.
- Integrate Dataflow with Kafka, Pub/Sub, Cloud Storage, and Snowflake.
- Optimize Dataflow jobs for performance, autoscaling, and cost efficiency.
- Implement Beam-based transformations for cleansing, enrichment, and aggregation (a pipeline sketch follows this list).
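The sketch below shows the shape of a Beam-based cleansing and enrichment pipeline in the Python SDK, reading raw JSON from Cloud Storage and writing curated output back for downstream Snowflake ingestion. Bucket paths and field names are hypothetical; pass --runner=DataflowRunner (plus project, region, and temp_location options) to execute on Dataflow instead of the local DirectRunner.

```python
# A minimal batch sketch with the Apache Beam Python SDK; the bucket
# paths and field names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def enrich(event):
    """Add a derived field; stands in for real enrichment logic."""
    event["amount_usd"] = round(event.get("amount", 0.0), 2)
    return event

def run():
    options = PipelineOptions()  # runner/project/region come from the command line
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read raw JSON" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "Parse" >> beam.Map(json.loads)
            | "Drop incomplete" >> beam.Filter(lambda e: e.get("event_id") is not None)
            | "Enrich" >> beam.Map(enrich)
            | "Serialize" >> beam.Map(json.dumps)
            | "Write curated" >> beam.io.WriteToText(
                "gs://example-bucket/curated/events", file_name_suffix=".json"
            )
        )

if __name__ == "__main__":
    run()
```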
Analytics & Visualization (Tableau)
- Enable downstream analytics by designing Snowflake structures optimized for Tableau dashboards (see the sketch after this list).
- Collaborate with BI teams to ensure semantic consistency, performance, and governed data access.
- Support Tableau extracts, live connections, and performance tuning.
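One common way to keep Tableau dashboards fast is to land a pre-aggregated reporting table in Snowflake so live connections issue narrow scans instead of wide fact-table queries. A minimal sketch using snowflake-connector-python follows; the account, credentials, and table names are hypothetical.

```python
# Minimal sketch: build a Tableau-friendly aggregate table in Snowflake.
# Account, credential, and object names are hypothetical placeholders.
import snowflake.connector

AGG_DDL = """
CREATE OR REPLACE TABLE ANALYTICS.REPORTING.DAILY_SALES_AGG AS
SELECT order_date,
       region,
       SUM(amount)                 AS total_sales,
       COUNT(DISTINCT customer_id) AS distinct_customers
FROM ANALYTICS.CURATED.ORDERS
GROUP BY order_date, region
"""

conn = snowflake.connector.connect(
    account="myaccount",
    user="BI_SVC",
    password="***",
    warehouse="REPORTING_WH",
    role="TRANSFORM_ROLE",
)
try:
    # The pre-aggregated table keeps Tableau live-connection queries small.
    conn.cursor().execute(AGG_DDL)
finally:
    conn.close()
```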
Governance, Quality, and Operations
- Implement data quality, lineage, metadata management, and governance frameworks (a minimal quality-check sketch follows this list).
- Establish CI/CD pipelines for ELT, Dataflow, and Snowflake deployments.
- Collaborate with business units to translate requirements into scalable data architectures.
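A data quality framework can start as simply as scripted assertions run after each load. The sketch below, again using snowflake-connector-python with hypothetical table, column, and connection names, illustrates the kind of checks such a framework would automate.

```python
# Minimal sketch of post-load quality checks via snowflake-connector-python.
# Table, column, and connection parameters are hypothetical placeholders.
import snowflake.connector

def run_checks(conn) -> list[str]:
    """Return human-readable failures; an empty list means all checks passed."""
    cur = conn.cursor()
    failures = []
    # Check 1: the curated table actually received rows.
    rows = cur.execute("SELECT COUNT(*) FROM ANALYTICS.CURATED.ORDERS").fetchone()[0]
    if rows == 0:
        failures.append("ANALYTICS.CURATED.ORDERS is empty")
    # Check 2: the business key is never NULL.
    nulls = cur.execute(
        "SELECT COUNT(*) FROM ANALYTICS.CURATED.ORDERS WHERE order_id IS NULL"
    ).fetchone()[0]
    if nulls > 0:
        failures.append(f"{nulls} rows with NULL order_id")
    return failures

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="myaccount", user="DQ_SVC", password="***", warehouse="ETL_WH"
    )
    try:
        for failure in run_checks(conn):
            print("FAILED:", failure)
    finally:
        conn.close()
```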
Required Skills & Experience
- 12+ years in data architecture, data engineering, or enterprise data warehousing.
- Deep hands-on experience with Snowflake (performance tuning, cost optimization, security).
- Strong experience with Kafka and realtime streaming ingestion patterns.
- Hands-on experience with Google Dataflow (Apache Beam) for batch and streaming pipelines.
- Strong SQL and ELT development skills (dbt, Matillion, ADF, Glue, or similar).
- Experience with Tableau data modeling, performance tuning, and dashboard optimization.
- Solid understanding of cloud architecture (AWS, Azure, or Google Cloud Platform).
- Experience with CI/CD, Git-based development, and automated deployments.