Job Details
Role: Snowflake Architect (Senior / Hands-on)
Location: Remote (PST hours)
Duration: 6+ Months
Contract Type: [W2 / C2C as applicable]
Job Description
We are seeking an experienced Snowflake Architect to design, develop, and maintain scalable data solutions using Snowflake, DBT, and Apache Airflow. The ideal candidate will have deep technical expertise in data architecture, data modeling, and ETL/ELT pipeline development, with a focus on performance, scalability, and automation.
Key Responsibilities
Design, develop, and maintain incremental data pipelines using Snowflake, DBT, and Apache Airflow.
Implement ETL/ELT workflows to process structured and semi-structured data at scale.
Build and optimize data models in Snowflake for analytics, BI, and reporting use cases.
Develop modular, testable, and version-controlled transformations using DBT.
Orchestrate, schedule, and monitor workflows using Apache Airflow.
Implement and maintain CI/CD practices for data pipelines and manage deployment processes.
Continuously optimize query performance and storage costs within the Snowflake environment.
Required Qualifications
10+ years of professional experience in Data Engineering or related roles.
Proven expertise in Snowflake, including data warehousing, performance tuning, and data sharing.
Strong hands-on experience with DBT for data modeling, transformation, and testing.
Proficiency with Apache Airflow (or similar orchestration tools) for workflow automation.
Strong command of SQL and Python.
Experience in designing and maintaining incremental ETL/ELT pipelines.
Exposure to at least one major cloud platform (AWS, Google Cloud Platform, or Azure).
Familiarity with Kafka or a similar streaming platform for real-time data processing.