Snowflake Data Architect with Cortex and Machine Learning

Overview

  • Location: Remote
  • Compensation: Depends on experience
  • Employment type: Contract (W2); corp-to-corp applications accepted
  • Contract length: 12 month(s)

Skills

Snowflake
Data Architect
Cortex
Machine Learning

Job Details

Role: Snowflake Data Architect / Data Modeler

Location: Remote

Duration: 6-12 Months

Position Overview

We are seeking a highly skilled Snowflake Architect / Data Modeler to design and implement scalable, high-performance cloud data solutions using Snowflake. This role blends strategic architecture planning with hands-on data modeling to support enterprise-level analytics, reporting, AI, and business intelligence initiatives.

The ideal candidate will bring deep experience in cloud data warehousing, requirements gathering, data modeling, and data pipeline optimization, along with a strong understanding of how to leverage emerging Snowflake-native AI and machine learning capabilities, including Snowflake Cortex.

As a technical architect, you will work closely with data engineers, analysts, and business stakeholders to build a modern, intelligent data architecture that not only meets current needs but also enables advanced AI/ML use cases powered by Snowflake's ecosystem.

Snowflake Architecture & Design

  • Design and implement Snowflake-based data solutions that align with business goals and industry best practices.
  • Architect scalable, high-performance data warehouse structures, storage strategies, and data sharing frameworks using Snowflake.
  • Define and manage virtual warehouses, clustering keys, multi-cluster compute environments, and data partitioning strategies (see the sketch after this list).
  • Establish and implement best practices for data governance, security, and cost management within Snowflake environments.
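
For context, here is a minimal sketch of the warehouse sizing and clustering work these bullets describe, in Snowflake SQL. Every object name and sizing value is illustrative, not taken from this posting:

```sql
-- Multi-cluster virtual warehouse that scales out under concurrent load.
-- Names and sizes are hypothetical.
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY    = 'STANDARD'
  AUTO_SUSPEND      = 300     -- suspend after 5 idle minutes to control cost
  AUTO_RESUME       = TRUE;

-- Clustering key on a large fact table so Snowflake can prune
-- micro-partitions on the most common filter columns.
ALTER TABLE sales_fact CLUSTER BY (sale_date, region_id);

-- Inspect how well the table is clustered on those columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales_fact', '(sale_date, region_id)');
```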

Data Modeling

  • Create conceptual, logical, and physical data models to support analytics, reporting, operational needs, and AI/ML readiness, using SQLDBM or similar modeling tools (a sample physical model follows this list).
  • Design data product models to support BI, self-service analytics, and AI-powered applications.
  • Translate business and analytical requirements into scalable, performant, and user-friendly data structures.
  • Collaborate with Data Engineering to develop detailed technical specifications based on models.
  • Partner with the Data Quality team on data profiling, lineage, and quality discovery.
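
A hypothetical slice of the physical layer such a model might produce: one fact table with a conformed dimension, the usual starting point for BI and AI-ready feature extraction. All names are invented for illustration:

```sql
-- Hypothetical physical model: conformed dimension plus fact table.
CREATE TABLE IF NOT EXISTS dim_customer (
  customer_key NUMBER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
  customer_id  VARCHAR NOT NULL,                  -- natural key from source
  segment      VARCHAR,
  region       VARCHAR
);

CREATE TABLE IF NOT EXISTS fact_orders (
  order_key    NUMBER AUTOINCREMENT PRIMARY KEY,
  customer_key NUMBER REFERENCES dim_customer (customer_key),
  order_date   DATE NOT NULL,
  order_amount NUMBER(12,2)
);
-- Note: Snowflake records but does not enforce PK/FK constraints
-- (NOT NULL excepted); they still document the model for BI tools.
```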

ETL/ELT Integration

  • Work closely with Data Engineers to design and optimize ELT/ETL pipelines using tools like Matillion or Snowflake's native features (Streams, Tasks, UDFs, Snowpipe); a brief sketch follows this list.
  • Ensure data pipelines can support real-time and batch AI/ML workflows, and integrate seamlessly with AI-ready datasets.
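
As a minimal sketch of the native incremental pattern those features enable, assuming hypothetical raw_orders and curated_orders tables and a transform_wh warehouse:

```sql
-- Capture row-level changes on the landing table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Scheduled task that only fires when the stream has new data.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  INSERT INTO curated_orders (order_id, customer_id, order_amount, order_date)
  SELECT order_id, customer_id, order_amount, order_date
  FROM   raw_orders_stream
  WHERE  METADATA$ACTION = 'INSERT';

ALTER TASK load_orders_task RESUME;  -- tasks are created suspended
```

Snowpipe would handle the continuous load into raw_orders; the same stream-and-task pattern then feeds both batch and near-real-time AI/ML consumers.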

AI Enablement & Cortex Integration

  • Collaborate with data scientists and ML engineers to enable Snowflake Cortex features, such as prebuilt LLM functions, sentiment analysis, and generative AI capabilities (illustrated after this list).
  • Develop strategies to expose AI-ready datasets to machine learning models and Cortex functions within Snowflake.
  • Evaluate and implement vector embeddings, semantic search, and predictive analytics use cases using Snowflake AI/ML tooling.
  • Support deployment and monitoring of AI/ML models within the Snowflake ecosystem, including Snowpark and external ML frameworks.
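
To make the Cortex bullets concrete, here is a hedged sketch of the SQL-level functions involved. Table and column names are hypothetical (review_embedding is assumed to be a VECTOR(FLOAT, 768) column), and the model names reflect current Cortex options that may change:

```sql
-- Sentiment score in [-1, 1] for free-text reviews.
SELECT review_id,
       SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment
FROM   product_reviews;

-- Prebuilt LLM completion for generative summarization.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
         'mistral-large',
         'Summarize this support ticket: ' || ticket_body) AS summary
FROM   support_tickets
LIMIT  10;

-- Vector embeddings plus cosine similarity for semantic search.
WITH query AS (
  SELECT SNOWFLAKE.CORTEX.EMBED_TEXT_768(
           'snowflake-arctic-embed-m',
           'late delivery complaints') AS q
)
SELECT r.review_id, r.review_text
FROM   product_reviews r, query
ORDER  BY VECTOR_COSINE_SIMILARITY(r.review_embedding, query.q) DESC
LIMIT  5;
```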

Collaboration & Best Practices

  • Work cross-functionally with data engineers, analysts, product managers, architects, and business stakeholders to deliver data solutions that meet enterprise needs.
  • Serve as a technical authority on Snowflake, data modeling, and AI enablement best practices.
  • Establish, document, and enforce standards and best practices for Snowflake development, data modeling, and cloud data warehousing.

Required Qualifications

  • 10+ years of experience in data architecture, data modeling, and data warehousing, including cloud data warehousing.
  • 3+ years of hands-on, production-level experience with Snowflake.
  • Familiarity with Snowflake Cortex and its integration into AI/ML workflows.
  • Experience supporting AI/ML workloads, including preparing AI-ready data models and feature engineering pipelines.
  • Exposure to vector databases, LLM-based data enrichment, or semantic search capabilities is a plus.
  • Strong command of SQL and experience with data modeling tools such as Erwin, SQLDBM, or equivalent.
  • In-depth understanding of data warehousing principles, data lake/data mesh architecture, and data product design.
  • Experience with ETL/ELT tools such as Matillion, dbt, or Informatica, and with cloud platforms (AWS preferred).
  • Proven ability to translate business needs into scalable technical solutions in a collaborative environment.

Best Regards,

Hari Krishna

(813) 435-5347
