Snowflake Architect or Lead with Cortex AI: REMOTE

Overview

Remote
$60 - $70 per hour
Contract - W2
Contract - 12 Month(s)

Skills

Snowflake
Cortex

Job Details

Snowflake Architect

REMOTE

Key Responsibilities
Design and develop data models, schemas, and database objects (tables, views, materialized views) within Snowflake to support analytics and reporting.
Build, operate, and optimize ETL/ELT pipelines (ingestion, transformation, and loading) for large-scale structured, semi-structured, and unstructured data.
Write and tune SQL queries, stored procedures, and user-defined functions to optimize performance, cost, and scalability.
Leverage Snowflake features (virtual warehouses, clustering/micro-partitioning, time travel, zero-copy cloning, data sharing, data masking, etc.) to maximize efficiency and governance.
Implement and maintain security, access control, and data governance (roles, privileges, row-level security, masking, audit) across the Snowflake environment.
Monitor and optimize system performance (compute costs, query latency, concurrency) and usage.
Collaborate with data architects, data engineers, data scientists, and analysts to understand requirements and deliver scalable data solutions.
Migrate data from legacy/on-premises databases or other cloud platforms into Snowflake, ensuring data integrity and minimal downtime.
Document schemas, pipeline logic, mapping, and best practices.
Stay current with Snowflake's evolving features (e.g., Snowpark and new governance or performance offerings) and industry best practices, and advocate for improvements.
Provide technical mentorship or guidance (if in a senior role), review designs/architecture, and help standardize practices across the team.

Skills & Qualifications / Requirements

Must-have / Core Skills
Deep expertise in SQL, with strong ability to write complex, optimized queries and troubleshoot performance issues.
Solid understanding of data warehousing concepts: dimensional modeling (star/snowflake schemas), normalization vs. denormalization, and fact/dimension tables.
Hands-on experience with the Snowflake Data Cloud (virtual warehouses, micro-partitions, data sharing, zero-copy cloning, time travel, etc.).
Experience designing and operating ETL/ELT pipelines (using tools such as dbt, Fivetran, Matillion, or Airflow).
Familiarity with one or more cloud platforms (AWS, Azure, Google Cloud Platform) and their integration with Snowflake.
Knowledge of performance tuning, resource management, and query optimization in a cloud warehouse context.
Experience implementing data security, role-based access control, masking, and data governance.
Strong problem-solving and analytical skills.
Excellent written and verbal communication skills; ability to translate business requirements into technical solutions.
