Job Details
We are seeking a highly skilled Snowflake Data Engineer / Analyst with strong expertise in Python, ETL pipelines, and data analytics. The ideal candidate will design, develop, and maintain scalable data solutions on the Snowflake platform while integrating data from multiple sources to support analytics and business intelligence.
Key Responsibilities
Design, develop, and optimize data pipelines and ETL workflows using Snowflake, Python, and modern data engineering tools (a minimal Python sketch follows this list).
Build, manage, and optimize Snowflake schemas, warehouses, and databases for high-performance analytics.
Develop data ingestion processes from structured and unstructured sources (APIs, RDBMS, cloud storage, streaming data).
Implement data transformation frameworks leveraging Python, SQL, and ETL tools (Informatica, Talend, dbt, or similar).
Collaborate with business stakeholders and analysts to deliver data models, dashboards, and reports that provide actionable insights.
Monitor and fine-tune Snowflake environments for performance, cost efficiency, and security compliance.
Develop and enforce data governance, quality, and validation frameworks.
Work with cross-functional teams to design scalable data architecture solutions that meet analytics and reporting needs.
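As a deliberately simplified illustration of the pipeline work described above, the sketch below pulls records from a REST endpoint, cleans them with pandas, and loads them into Snowflake using the snowflake-connector-python package. Every name in it (the API URL, credentials, warehouse, database, schema, and table) is an illustrative placeholder, not a detail of this role.

    # Minimal ETL sketch: pull rows from a (hypothetical) REST API and load them
    # into Snowflake. All connection parameters and the URL are placeholders.
    import os
    import requests
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    def run_etl():
        # Extract: fetch JSON records from the source API.
        records = requests.get("https://api.example.com/orders", timeout=30).json()

        # Transform: normalize into a DataFrame and apply light cleansing.
        df = pd.DataFrame(records).drop_duplicates()
        # Upper-case column names avoid quoted, case-sensitive identifiers in Snowflake.
        df.columns = [c.upper() for c in df.columns]

        # Load: write the frame into a staging table via the connector.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ANALYTICS_WH",   # placeholder warehouse
            database="RAW",             # placeholder database
            schema="ORDERS",            # placeholder schema
        )
        try:
            # auto_create_table is supported in recent connector versions.
            write_pandas(conn, df, table_name="ORDERS_STG", auto_create_table=True)
        finally:
            conn.close()

    if __name__ == "__main__":
        run_etl()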
Required Skills & Qualifications
Hands-on experience with Snowflake (warehouses, schemas, stages, streams, tasks, and Snowpipe); a brief stream-and-task illustration follows this list.
Strong Python programming skills for data manipulation, automation, and integration.
Expertise in ETL/ELT pipeline design and orchestration (Airflow, dbt, Informatica, or equivalent).
Proficiency in SQL (advanced queries, performance tuning, stored procedures).
Solid understanding of data modeling (star schema, snowflake schema, OLAP/OLTP).
Experience in data engineering workflows (data ingestion, transformation, cleansing, enrichment).
Knowledge of analytics and BI tools (Tableau, Power BI, Looker, or similar).
Familiarity with cloud platforms (AWS, Azure, Google Cloud Platform) and storage solutions (S3, ADLS, GCS).
Strong problem-solving, analytical, and communication skills.
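To make the Snowflake-specific items above concrete, the sketch below uses the same placeholder connection to create a stream on the staging table and a scheduled task that merges stream rows into a fact table. All object names are invented for illustration, and the MERGE columns assume the staging table from the earlier sketch.

    # Illustrative Snowflake stream + task setup, run through the Python connector.
    import os
    import snowflake.connector

    STATEMENTS = [
        # A stream tracks inserts/updates/deletes on the staging table.
        "CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS_STG",
        # A task periodically consumes the stream and upserts into the fact table.
        """
        CREATE OR REPLACE TASK LOAD_ORDERS_FACT
          WAREHOUSE = ANALYTICS_WH
          SCHEDULE = '5 MINUTE'
        AS
          MERGE INTO ORDERS_FACT f
          USING ORDERS_STREAM s ON f.ORDER_ID = s.ORDER_ID
          WHEN MATCHED THEN UPDATE SET f.AMOUNT = s.AMOUNT
          WHEN NOT MATCHED THEN INSERT (ORDER_ID, AMOUNT) VALUES (s.ORDER_ID, s.AMOUNT)
        """,
        # Tasks are created suspended; RESUME is what starts the schedule.
        "ALTER TASK LOAD_ORDERS_FACT RESUME",
    ]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="RAW",
        schema="ORDERS",
    )
    try:
        cur = conn.cursor()
        for stmt in STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()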
Preferred Skills
Experience with real-time data streaming tools (Kafka, Kinesis, Pub/Sub); see the consumer sketch at the end of this posting.
Knowledge of CI/CD pipelines for data engineering.
Exposure to machine learning workflows using Python.
Understanding of data security, access control, and compliance frameworks.
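For the streaming item above, here is a minimal consumer sketch assuming the kafka-python package. The topic, brokers, and batch size are placeholders, and the flush step is stubbed out where a real pipeline would stage the batch and load it with Snowpipe or COPY INTO.

    # Hedged sketch of a real-time ingestion path: consume JSON events from Kafka
    # and micro-batch them for loading into Snowflake.
    import json
    from kafka import KafkaConsumer   # kafka-python package

    consumer = KafkaConsumer(
        "orders-events",                          # placeholder topic
        bootstrap_servers=["localhost:9092"],     # placeholder brokers
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    batch = []
    for message in consumer:
        batch.append(message.value)
        if len(batch) >= 500:
            # A real pipeline would write this batch to a Snowflake stage and
            # load it via Snowpipe or COPY INTO; printed here to keep it short.
            print(f"flushing {len(batch)} events")
            batch.clear()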