Snowflake Data Engineer

  • San Jose, CA
  • Posted 23 hours ago | Updated 21 hours ago

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)

Skills

Data Engineering
SQL
Python
Snowflake architecture
ETL/ELT
dbt
AWS / Azure / GCP

Job Details

Key Responsibilities:
Design and implement Snowflake architecture, including schema design, performance tuning, and data governance.
Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data.
Write complex SQL queries for data extraction, transformation, and optimization.
Utilize Python for data processing, automation, and integration workflows.
Collaborate with data analysts, data scientists, and business stakeholders to deliver high-quality data solutions.
Monitor, troubleshoot, and optimize data systems for performance and reliability.
Ensure data security, quality, and compliance across all environments.

Qualifications & Skills:
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven experience as a Data Engineer or similar role.
Strong hands-on expertise in Snowflake architecture (data modeling, performance optimization, security, and scaling).
Proficiency in SQL and query performance tuning.
Experience with Python for data transformation and pipeline development.
Familiarity with cloud platforms (AWS / Azure / Google Cloud Platform) for data engineering solutions.
Strong problem-solving and debugging skills.
Excellent communication and collaboration abilities.

Nice to Have:
Experience with orchestration and transformation tools (Airflow, dbt, Prefect, etc.).
Knowledge of APIs, REST services, and real-time data processing.
Exposure to CI/CD practices for data pipelines.
