Lead Snowflake Data Engineer

Remote • Posted 15 hours ago • Updated 15 hours ago
Full Time
No Travel Required
Remote
Depends on Experience


Job Details

Skills

  • ERP
  • AI/ML
  • Cortex
  • Multi-tenancy

Summary

Position: Senior Snowflake Data Engineer
Location: San Francisco, CA (initially 2–3 weeks onsite, then fully remote)

Job Description:

Note: The client is specifically looking for Snowflake Cortex AI, multi-tenancy, and AI/ML experience.

About the Role

We are looking for a Senior Snowflake Data Engineer with deep expertise in modern data platforms and large-scale cloud data architectures. This role is part of a high-visibility initiative to build a unified enterprise data foundation powering advanced analytics, AI/ML workloads, and mission-critical decision systems.

You will design complex Snowflake architectures, lead data engineering best practices, mentor engineers, and drive end-to-end data platform modernization at scale.

This is a role for senior, hands-on engineers who excel at solving hard problems, optimizing systems, and driving technical excellence in fast-paced environments.

 

Key Responsibilities

Architecture & System Design

• Own the end-to-end architecture, design, and optimization of Snowflake environments.

• Build scalable data ingestion, transformation, and orchestration frameworks capable of handling high-volume, high-velocity enterprise data.

• Architect complex ELT pipelines using Snowflake Streams, Tasks, Snowpipe, Materialized Views, and Dynamic Tables.

• Create performant dimensional and data vault models with a strong understanding of warehouse design principles.

Advanced Engineering & Optimization

• Lead performance tuning, including clustering, micro-partition optimization, and query acceleration strategies.

• Drive cost governance, warehouse sizing strategies, auto-suspend/auto-resume setups, and resource monitoring.

• Build reusable frameworks for schema evolution, metadata management, and automated quality checks.

• Develop CI/CD workflows for data transformations, infrastructure-as-code, and versioned data pipelines.

 

AI/ML Data Enablement

• Partner closely with AI/ML teams to deliver feature-ready datasets, high-throughput pipelines, and real-time data delivery mechanisms.

• Architect data flows to support model training, validation, batch/real-time inference, and lineage tracking.

• Enable feature stores, embedding pipelines, and vectorized data workflows where needed.

 

Leadership & Collaboration

• Provide technical leadership to data engineering teams, drive best practices, and guide architectural decisions.

• Work with cross-functional stakeholders—platform engineering, product, analytics, and security—to build a cohesive data ecosystem.

• Lead code reviews, mentor junior engineers, and raise the overall engineering bar.

Governance, Reliability & Security

• Implement strong role-based access control, data masking, and enterprise-grade security frameworks.

• Establish data quality SLAs: validation rules, anomaly detection, and automated reconciliation.

• Build monitoring dashboards for pipeline observability, reliability metrics, and incident response workflows.

 

Required Qualifications

• 6–12+ years of experience in data engineering, with deep hands-on Snowflake expertise.

• Expert-level proficiency in SQL, advanced query optimization, and distributed data processing concepts.

• Strong experience with Python and building production-grade data pipelines.

• Hands-on experience with Airflow, dbt, Dagster, or similar orchestration/ELT tools.

• Strong understanding of cloud ecosystems (AWS/Google Cloud Platform/Azure), including IAM, networking, object storage, and security.

• Proven track record designing enterprise-scale data architectures for complex analytics or AI platforms.

• Experience leading engineering efforts, mentoring, and driving technical direction.

 

Preferred Qualifications

• Experience supporting AI/ML engineering workflows or building ML-ready data layers.

• Deep knowledge of Snowflake features such as:

  o Zero-copy cloning

  o Resource monitors

  o Streams, Tasks, Pipes

  o Time Travel & Fail-safe

• Exposure to event-driven data pipelines, Kafka, Kinesis, Pub/Sub, or similar platforms.

• Background in consulting, platform modernization, or large enterprise transformation programs.

 

What Success Looks Like

• You design high-performance, scalable Snowflake data systems that handle complex business & AI use cases.

• You proactively identify architectural gaps and deliver robust, forward-looking solutions.

• You mentor engineers and become a technical backbone for the data platform.

• You consistently deliver reliable, high-quality data to downstream AI, analytics, and operational systems.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10513292
  • Position Id: 71988-12895-