Snowflake Architect

Overview

Remote
$60 - $70
Contract - Independent
Contract - W2

Skills

Snowflake
Architect
Airflow
DataOps
ETL
AWS

Job Details

Title: Snowflake Architect

Location: Remote

Contract role

Must have strong architect-level experience

Looking for strong experience in Airflow and Snowflake

Job Description:

  • We are seeking a highly skilled and experienced Data Engineering/DevOps Lead with a strong focus on Snowflake. The ideal candidate has hands-on expertise in architecting and implementing data and DataOps solutions for cloud-based data platforms, leveraging AWS/Azure, Snowflake, ETL tools, and data warehousing.

Key Responsibilities:

  • Lead the design and implementation of scalable, robust, and high-performance data pipelines, ETL processes, and analytics solutions using AWS services, Snowflake, and other Big Data technologies.
  • Design and implement complex Data Warehousing/Data Lake solutions on Snowflake.
  • Design and implement DataOps processes for Snowflake, including automation, CI/CD, and orchestration.
  • Troubleshoot and performance-tune complex SQL queries on Snowflake, with a deep understanding of Snowflake cost-optimization strategies.
  • Collaborate with business stakeholders, data scientists, and other cross-functional teams to understand their requirements and translate them into innovative data solutions.
  • Provide leadership in data engineering initiatives, guiding the team in best practices, performance optimization, and ensuring the successful delivery of high-quality data solutions.

Desirable skills:

  • 14+ years of experience in data engineering and DataOps, preferably in a senior or lead capacity.
  • Proficiency with data platform architecture and design, data dictionaries, multi-dimensional models, database objects, star and snowflake schemas, and structures for data lakes and data warehouses using Snowflake.
  • Experience implementing core Snowflake concepts such as data sharing, UDFs, zero-copy clones, time travel, micro-partitions, stored procedures, data import/export, and external tables.
  • Strong expertise in designing and building scalable data pipelines, ETL processes, and analytics solutions using AWS services (e.g., S3, Glue, Lambda), ETL tools (e.g., Qlik, Fivetran, Airflow, dbt), and Snowflake.
  • Experience in advanced Snowflake concepts, including RBAC, SnowPark, and architecting and setting up the Snowflake platform.
  • Expertise in implementing Continuous Integration and Continuous Deployment (CI/CD) processes specifically tailored for Snowflake, ensuring efficient and automated deployment workflows.
  • Exceptional communication and collaboration skills, with the ability to work effectively with senior stakeholders and cross-functional teams.
  • Good understanding of cloud-based data solution components and architecture, covering data ingestion, data processing, data cataloging, security, DevOps, consumption, etc.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About S Linx LLC