Snowflake DBT Lead

  • Dallas, TX
  • Posted 23 hours ago | Updated 23 hours ago

Overview

Hybrid
Depends on Experience
Contract - Independent
Contract - W2
Contract - 6 Month(s)
No Travel Required
Unable to Provide Sponsorship

Skills

Amazon S3
Amazon Web Services
Clustering
ELT
Snowflake Schema
Microsoft Azure
DBT
Snowflake

Job Details

Hi,

Hope you are doing well. Please find the job description below.

Title: Snowflake DBT Lead

Location: Dallas, TX (Onsite)

Type of Hire: Contract

Note: Kindly share only local profiles, as an in-person interview may be required; the client also requires 12+ years of experience.

Key Responsibilities

Technical

  • Design and implement modular, reusable DBT models for data transformation in Snowflake
  • Optimize Snowflake performance through clustering, partitioning, caching, and query tuning
  • Define and manage schema objects, including databases, schemas, tables, views, and stages
  • Build and maintain ELT pipelines using Snowflake-native features such as Snowpipe, Streams, and Tasks
  • Integrate Snowflake with external data sources and cloud storage (e.g., AWS S3, Azure Blob Storage, Google Cloud Platform)
  • Optimize query performance using clustering keys, result caching, and materialized views
  • Monitor and tune warehouse performance and cost efficiency
  • Leverage advanced Snowflake features such as Time Travel, Zero-Copy Cloning, and Data Sharing
  • Explore and implement UDFs, external functions, and Snowpark where applicable
  • Ensure compliance with data governance and privacy standards
  • Automate workflows using orchestration tools (e.g., Airflow, Azure Data Factory)
  • Schedule and monitor data jobs using Snowflake Tasks and external schedulers
  • Collaborate with data analysts, architects, and business stakeholders to translate requirements into scalable data solutions
  • Design and implement DBT projects from scratch, including folder structure, model layers (staging, intermediate, marts), and naming conventions
  • Use Git for version control of DBT projects
  • Design, build, and maintain modular DBT models for data transformation
  • Implement staging, intermediate, and mart layers following best practices
  • Use Jinja templating and macros to create reusable logic
  • Define and manage tests (e.g., uniqueness, not-null, accepted values) within DBT
  • Monitor test results and resolve data quality issues proactively
  • Implement CI/CD pipelines for DBT projects using Git, Bitbucket, and Jenkins
  • Ensure data governance, lineage, and documentation using tools like dbt docs and metadata tagging
  • Integrate Snowflake with cloud storage (e.g., Google Cloud Platform, Azure Blob Storage, AWS S3) and orchestration tools (e.g., Airflow, Azure Data Factory)
  • Troubleshoot and resolve data quality issues and performance bottlenecks
  • Implement role-based access controls and data masking where required
  • Ensure compliance with data governance and privacy policies
  • Integrate DBT with orchestration tools (e.g., Airflow, Prefect)
  • Schedule and monitor DBT runs in production environments
Functional

  • Prior experience working with sources such as SAP ECC and S/4HANA
  • Functional understanding of at least one of these SAP modules: Supply Chain, Finance (FICO), Sales & Distribution
  • Prior experience pulling data from SAP sources
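As an illustration of the modular, layered DBT modelling described above, a minimal staging model might look like the following sketch (the project structure, source, and column names are hypothetical, not taken from the client's environment):

```sql
-- models/staging/stg_orders.sql -- hypothetical staging model
-- Incremental materialization so production runs only process new rows.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('sap_ecc', 'orders') }}  -- source would be declared in a sources.yml

{% if is_incremental() %}
  -- On incremental runs, pick up only rows newer than what is already loaded.
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

Data quality tests such as `unique` and `not_null` on `order_id` would typically be declared next to the model in a `schema.yml` file; `dbt build` then compiles the Jinja, runs the model against Snowflake, and executes its tests.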

Required Skills

  • 6 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel
  • 2 years of hands-on experience with DBT Core or DBT Cloud in a production environment
  • Strong SQL skills and experience with data modelling (star/snowflake schema, normalization/denormalization)
  • Deep understanding of DBT features: materializations (table, view, incremental, ephemeral), macros, seeds, snapshots, tests, and documentation
  • Experience with cloud data warehouses (Snowflake)
  • Proficiency in Git, CI/CD, and workflow orchestration (e.g., Airflow, dbt Cloud)
  • Familiarity with Jinja templating, YAML configuration, and DBT project structure
  • Strong communication skills and the ability to work cross-functionally

Preferred Qualifications

  • SnowPro Core Certification or equivalent
  • Experience with Airflow Azure Data Factory or similar orchestration tools
  • Familiarity with data cataloguing and lineage tools
  • Knowledge of data security RBAC and masking in Snowflake
  • Experience working in Agile Scrum environments
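As a sketch of the Streams-and-Tasks ELT pattern the role calls for, a change-capture stream feeding a scheduled merge might look like this (table, warehouse, and task names are illustrative only):

```sql
-- Hypothetical Snowflake Stream + Task pair for continuous ELT.
-- The stream records row-level changes on the raw table.
create or replace stream orders_stream on table raw.orders;

-- The task runs on a schedule, but only when the stream has new data.
create or replace task merge_orders
  warehouse = transform_wh
  schedule = '5 MINUTE'
when system$stream_has_data('ORDERS_STREAM')
as
  merge into analytics.orders t
  using orders_stream s
    on t.order_id = s.order_id
  when matched then
    update set t.amount = s.amount
  when not matched then
    insert (order_id, amount) values (s.order_id, s.amount);

-- Tasks are created in a suspended state and must be resumed explicitly.
alter task merge_orders resume;
```

Consuming the stream inside the merge advances its offset, so each task run processes only the changes that arrived since the previous run.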

Skills

Mandatory Skills: Snowflake, ANSI-SQL, DBT

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.