Snowflake Solution Architect

Overview

On Site
Depends on Experience
Full Time

Skills

Snowflake Schema
Generative Artificial Intelligence (AI)
Extract, Transform, Load (ETL)
Python
SQL
Machine Learning (ML)
ELT
Data Quality
Data Governance
Data Engineering
Amazon Web Services
Artificial Intelligence
Cloud Computing
Database
Cortex AI
dbt

Job Details

Job Title: Snowflake Solution Architect

Location: Baltimore, MD (5 days onsite)

Type: Full-Time Position

Job Description:

Must Have Technical/Functional Skills:

  • Snowflake expertise: Warehouses, databases, roles, RBAC, SCIM, MFA.
  • Data Engineering: ELT/ETL tools (dbt, Talend), orchestration (Airflow).
  • Cloud Platforms: AWS, Azure, or Google Cloud Platform with Snowflake integration.
  • Programming: SQL, Python; familiarity with ML frameworks.
  • Security & Compliance: Data masking, encryption, audit processes.
  • Strong experience with LLMs (OpenAI, Anthropic, Hugging Face, LangChain).
  • Proficiency in Python and modern AI frameworks.
  • Familiarity with vector databases, prompt engineering, and AI best practices.
  • 5+ years of product-focused engineering experience.
  • Knowledge of cloud deployment and scaling AI systems.
  • Strong SQL and Python skills.
  • Hands-on experience with dbt and Snowflake.
  • Familiarity with cloud platforms (AWS).
  • Knowledge of CI/CD, DevOps practices, and data orchestration tools (Airflow, Prefect).
  • Ability to create lineage graphs, documentation, and validation frameworks.

Must-Have Skills: Snowflake, Cortex AI, AWS, dbt

Roles & Responsibilities

  • Design and manage data pipelines using dbt, Airflow, and CI/CD frameworks.
  • Implement Snowpipe for continuous ingestion, Streams & Tasks for real-time processing.
  • Enable AI/ML integration: Support predictive analytics and generative AI use cases.
  • Leverage Snowflake Cortex and Copilot for LLM-based applications.
  • Ensure data governance, RBAC, and security compliance across Snowflake environments.
  • Optimize performance and implement Time Travel, Zero-Copy Cloning, and Secure Data Sharing.
  • Build production-ready AI applications and LLM-powered features.
  • Collaborate with AI + Data teams to develop agentic AI workflows.
  • Experiment with open-source models and translate prototypes into production systems.
  • Implement RAG pipelines, fine-tuning, and observability for AI models.
  • Design and deploy secure, scalable, and highly available architectures on AWS.
  • Select appropriate AWS services for application design and deployment.
  • Implement cost-control strategies and disaster recovery plans.
  • Collaborate with teams to integrate systems and ensure compliance.
  • Develop Infrastructure as Code (IaC) using Terraform or CloudFormation.
  • Design, build, and maintain data pipelines using dbt for analytics and operational use cases.
  • Implement standards for data quality, consistency, and reliability.
  • Optimize query performance and manage compute costs.
  • Collaborate with analysts and stakeholders to understand data requirements.
  • Build automation into workflows and ensure compliance with governance policies.