Snowflake Data Engineering

Overview

Hybrid
Depends on Experience
Full Time

Skills

Amazon Web Services
Agile
Data Engineering
Cloud Computing
Data Integration
Snowflake Ecosystem
PySpark
Python
SQL
JavaScript
Data Modeling

Job Details

Job Title: Snowflake Data Engineering with AWS, Python and PySpark
Location: Frisco, TX (3 days per week in office)

Duration: 12 months


Required Skills & Experience:

  • 10+ years of experience in data engineering and data integration roles.
  • Expert in working with the Snowflake ecosystem integrated with AWS services and PySpark.
  • 8+ years of core data engineering skills, with hands-on experience across the Snowflake ecosystem, AWS, core SQL, and Python programming.
  • 5+ years of hands-on experience building new data pipeline frameworks with AWS, Snowflake, and Python, and the ability to evaluate new ingestion frameworks.
  • Hands-on with Snowflake architecture: Virtual Warehouses, Storage and Caching, Snowpipe, Streams, Tasks, and Stages.
  • Experience with cloud platforms (AWS, Azure, or Google Cloud Platform) and integration with Snowflake.
  • Snowflake SQL and Stored Procedures (JavaScript or Python-based).
  • Proficient in Python for data ingestion, transformation, and automation.
  • Solid understanding of data warehousing concepts (ETL, ELT, data modeling, star/snowflake schema).
  • Hands-on with orchestration tools (Airflow, dbt, Azure Data Factory, or similar).
  • Proficiency in SQL and performance tuning.
  • Familiar with Git-based version control, CI/CD pipelines, and DevOps best practices.
  • Strong communication skills and ability to collaborate in agile teams.

About IFLOWSOFT Solutions Inc.