Data Architect (Snowflake/Databricks)

  • Reston, VA
  • Posted 22 hours ago | Updated 22 hours ago

Overview

Hybrid
$70 - $80
Accepts corp to corp applications
Contract - Independent
Contract - 12 Month(s)

Skills

Amazon Web Services
Snowflake
SQL
Databricks

Job Details

Title: Data Architect (Snowflake/Databricks)

Location: Reston/Herndon, VA (Hybrid)

Duration: Long Term

Job Summary:

We are looking for a seasoned Data Architect with deep expertise in Snowflake and/or Databricks to lead the design and implementation of scalable, high-performance cloud data platforms. You will be responsible for defining data architecture, driving data platform strategy, and ensuring best practices in data management, governance, and performance.

The ideal candidate is a hands-on technical leader with a solid understanding of cloud-native data architecture, big data processing, and modern data platforms.

Key Responsibilities:

  • Architect, design, and implement scalable cloud data solutions using Snowflake, Databricks, and associated tools.
  • Define data architecture standards, patterns, and governance frameworks.
  • Collaborate with data engineers, analysts, and stakeholders to translate business requirements into data platform designs.
  • Lead migration initiatives from legacy data warehouses to Snowflake or Databricks.
  • Develop strategies for data ingestion, transformation, governance, and lineage.
  • Optimize data pipelines for performance and cost efficiency across large datasets.
  • Ensure compliance with data security, privacy, and regulatory requirements.
  • Stay current on industry trends and recommend new technologies or enhancements.
  • Provide architectural guidance and mentorship to engineering teams.

Required Skills & Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data engineering or data architecture roles.
  • 3+ years of hands-on experience with Snowflake and/or Databricks in production environments.
  • Strong knowledge of SQL, Python, and Spark (especially PySpark).
  • Deep understanding of modern data architectures including lakehouse, data mesh, ETL/ELT, and streaming data.
  • Experience with cloud platforms (AWS, Azure, or Google Cloud Platform), preferably multi-cloud or hybrid environments.
  • Expertise in data modeling, performance tuning, and large-scale data processing.
  • Familiarity with CI/CD, DevOps, and Infrastructure as Code (IaC) for data pipelines.
  • Strong understanding of data governance frameworks and tools like Unity Catalog, Alation, or Collibra.

Preferred Qualifications:

  • Certifications in Snowflake, Databricks, or cloud platforms (AWS/Azure/Google Cloud Platform).
  • Experience with Delta Lake, MLflow, or Feature Store.
  • Exposure to BI tools (Power BI, Tableau, Looker) and how they integrate with data platforms.
  • Experience with Kafka, Airflow, dbt, or other orchestration/streaming tools.

Thanks,

Naveen S
