Snowflake Developer (Onsite)

Overview

On Site
Depends on Experience
Contract - Independent
Contract - W2
Contract - 6 Month(s)

Skills

Amazon Kinesis
Amazon Web Services
Analytical Skills
Analytics
Apache Kafka
Apache Spark
Business Intelligence
Cloud Computing
Collaboration
Communication
Conflict Resolution
Data Engineering
Data Loading
Data Processing
Data Quality
Data Warehouse
ELT
Extract, Transform, Load (ETL)
Google Cloud Platform
Informatica
JavaScript
Management
Microsoft Azure
Modeling
Optimization
Oracle
Orchestration
Performance Tuning
Problem Solving
Python
Real-time
SQL
Scripting
Snowflake Schema
Soft Skills
Storage
Stored Procedures
Streaming
Talend
Teradata
Workflow

Job Details

Role: Snowflake Developer
Location: RTP, NC / San Jose, CA

Key Responsibilities:
Design, develop, and maintain data pipelines and ELT workflows in Snowflake using DBT, Python, and other tools.
Implement and optimize data models, Snowflake schemas, and SQL transformations for scalable analytics solutions.
Develop and manage user-defined functions (UDFs), stored procedures, and SnowSQL scripts for automation and advanced data processing (see the sketch after this list).
Integrate data from multiple sources (e.g., Oracle, Teradata, APIs, streaming platforms) into Snowflake with high performance and reliability.
Optimize query performance, storage usage, and compute resources in Snowflake.
Implement data quality, monitoring, and governance best practices.
Collaborate with cross-functional teams including Data Analysts, Architects, and BI developers to deliver robust end-to-end data solutions.
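
For illustration only, a minimal SnowSQL sketch of the kind of ELT step described above (the stage, table, and column names are hypothetical, not taken from any actual environment):

    -- Bulk-load staged files into a raw table (the "load" step of ELT).
    COPY INTO raw_orders
      FROM @orders_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
      ON_ERROR = 'CONTINUE';

    -- Transform in-warehouse: keep only the latest row per order_id.
    CREATE OR REPLACE TABLE orders AS
    SELECT *
    FROM raw_orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY loaded_at DESC) = 1;

In a DBT project, the transform half of this would typically live in a model file rather than a hand-run script.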
Required Skills:
4-10 years of overall experience in Data Engineering.
Strong hands-on experience in Snowflake including data load, schema design, performance tuning, and SnowSQL scripting.
Strong programming experience in Python for data ingestion, transformation, and automation.
Proficiency in DBT (Data Build Tool) for modeling, transformations, and workflow orchestration.
Excellent command of SQL, including complex queries, optimization, and window functions (a short example follows this list).
Experience working with large-scale data sets and performance optimization techniques.
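
As a short, hypothetical example of the window-function fluency called for above (table and column names are invented for illustration):

    -- Rank each customer's orders by value and keep a running spend total.
    SELECT
        customer_id,
        order_id,
        order_total,
        ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS value_rank,
        SUM(order_total) OVER (
            PARTITION BY customer_id
            ORDER BY order_date
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
        ) AS running_total
    FROM orders;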
Good to Have:
Exposure to real-time data ingestion frameworks (Kafka, Kinesis, Spark Streaming, etc.) and streaming analytics.
Experience with ETL/ELT tools (Informatica, Talend, Airflow, etc.).
Understanding of data warehousing best practices and cloud platforms (AWS, Azure, Google Cloud Platform).
Knowledge of JavaScript for Snowflake stored procedures (an illustrative sketch follows this list).
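
For context, Snowflake stored procedures can be written in JavaScript via standard SQL DDL; a minimal, hypothetical sketch (procedure and argument names are illustrative only):

    CREATE OR REPLACE PROCEDURE row_count(table_name STRING)
      RETURNS FLOAT
      LANGUAGE JAVASCRIPT
      AS
      $$
        // Arguments are exposed to JavaScript in upper case (TABLE_NAME).
        // A production version would validate the name to avoid SQL injection.
        var rs = snowflake.execute({ sqlText: "SELECT COUNT(*) FROM " + TABLE_NAME });
        rs.next();
        return rs.getColumnValue(1);
      $$;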
Soft Skills:
Strong problem-solving and analytical mindset.
Excellent communication and collaboration skills.
Ability to work in a fast-paced, customer-focused environment.


About American IT Systems