Snowflake Architect with Strong Databricks Experience

Louisville, KY, US • Posted 14 hours ago • Updated 14 hours ago
Contract W2
Contract Independent
No Travel Required
On-site
$60 - $65/hr

Job Details

Skills

  • Snowflake
  • Snowflake Schema
  • Databricks
  • ADF
  • SQL
  • RBAC
  • Python
  • Scala
  • Java
  • Microsoft Azure
  • Apache Spark
  • Business Intelligence
  • Amazon Web Services
  • Apache Iceberg

Summary

Role Overview

As a Data Architect, you will lead the design and implementation of a modern Lakehouse + Data Cloud ecosystem. You will be responsible for defining how data flows from raw telemetry in Databricks into highly optimized, governed presentation layers in Snowflake. Your goal is to balance the high-performance engineering capabilities of Spark/Delta Lake with the seamless, SQL-first scalability of Snowflake.

Key Responsibilities

  • Unified Architecture Design: Develop end-to-end blueprints integrating Databricks (for ingestion, streaming, and ML) with Snowflake (for BI, reporting, and secure data sharing).
  • Data Modeling: Design sophisticated schemas using Data Vault 2.0, Dimensional Modeling, or Data Mesh principles to ensure cross-platform consistency.
  • Integration Leadership: Implement efficient data movement patterns using Snowpipe Streaming, Apache Iceberg (for interoperability), or Unity Catalog-to-Snowflake integrations (a sketch of this pattern follows this list).
  • Performance Optimization: Tune Databricks clusters (Photon engine, Liquid Clustering) and Snowflake warehouses (Auto-scaling, Query Acceleration) to minimize latency and cost.
  • Governance & Security: Establish a unified security posture across both platforms using RBAC, Row-Level Security, and data masking, ensuring compliance with GDPR/CCPA.
  • AI/ML Readiness: Architect data foundations that support Databricks Mosaic AI and Snowflake Cortex, enabling LLM and predictive analytics use cases.
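
As an illustration of the integration item above (a sketch only, not part of the posting; all paths, checkpoint locations, and table names are hypothetical placeholders), the Databricks half of the pattern might begin with Auto Loader streaming raw telemetry into a Delta table:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import current_timestamp

    # Hypothetical job: incrementally ingest raw telemetry with Databricks
    # Auto Loader (the "cloudFiles" streaming source).
    spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

    raw = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/checkpoints/telemetry/schema")
        .load("/mnt/landing/telemetry/")  # hypothetical landing path
    )

    # Land the stream in a governed Delta table; Snowflake can then consume
    # the same data via Iceberg/external tables or a Snowpipe feed.
    (
        raw.withColumn("ingested_at", current_timestamp())
        .writeStream.format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/telemetry/bronze")
        .trigger(availableNow=True)
        .toTable("bronze.telemetry_events")
    )

Under this assumption, the Snowflake side of the hand-off (Iceberg tables, Snowpipe Streaming, or a Unity Catalog integration) becomes a consumption choice rather than a second copy of the pipeline.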

 

Technical Requirements

  • Advanced expertise in Snowpark (Python/Java), Dynamic Tables, Horizon Catalog, and zero-copy cloning (see the Snowpark sketch after this list).
  • Deep knowledge of Delta Lake, Spark Structured Streaming, Delta Live Tables (DLT), and Unity Catalog.
  • Mastery of SQL and Python; familiarity with Scala or Java is a plus.
  • Hands-on experience with Apache Iceberg and external tables to bridge the two ecosystems.
  • Experience with Airflow, dbt (Cloud/Core), or Dagster for orchestrating multi-platform pipelines.
  • Proficiency in at least one major cloud provider (AWS, Azure, or Google Cloud Platform).
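
For the Snowpark expertise referenced above, a minimal Snowpark for Python sketch (illustrative only; the connection parameters, table, and column names are hypothetical placeholders, not drawn from this posting):

    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # Hypothetical connection details; in practice these would come from a
    # secrets manager or environment configuration, never hard-coded.
    params = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "REPORTING_WH",
        "database": "ANALYTICS",
        "schema": "PRESENTATION",
    }
    session = Session.builder.configs(params).create()

    # Build a lazy DataFrame; the filter and aggregation are pushed down and
    # run inside the Snowflake warehouse, not on the client.
    daily_revenue = (
        session.table("FACT_ORDERS")  # hypothetical fact table
        .filter(col("ORDER_STATUS") == "COMPLETE")
        .group_by("ORDER_DATE")
        .agg(sum_(col("ORDER_TOTAL")).alias("REVENUE"))
    )
    daily_revenue.show()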

Preferred Certifications

  • Snowflake: SnowPro Advanced: Architect (ARA-C01)
  • Databricks: Databricks Certified Solutions Architect or Data Engineer Professional
  • Cloud: AWS Certified Data Engineer or Azure Solutions Architect

  • Dice Id: 91163556
  • Position Id: 8923501
