Job Details
Key Responsibilities:
Design and implement Snowflake architecture, including schema design, performance tuning, and data governance.
Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data.
Write complex SQL queries for data extraction, transformation, and optimization.
Utilize Python for data processing, automation, and integration workflows.
Collaborate with data analysts, data scientists, and business stakeholders to deliver high-quality data solutions.
Monitor, troubleshoot, and optimize data systems for performance and reliability.
Ensure data security, quality, and compliance across all environments.
Qualifications & Skills:
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven experience as a Data Engineer or in a similar role.
Strong hands-on expertise in Snowflake architecture (data modeling, performance optimization, security, and scaling).
Proficiency in SQL and query performance tuning.
Experience with Python for data transformation and pipeline development.
Familiarity with cloud platforms (AWS / Azure / Google Cloud Platform) for data engineering solutions.
Strong problem-solving and debugging skills.
Excellent communication and collaboration abilities.
Nice to Have:
Experience with orchestration and transformation tools (Airflow, dbt, Prefect, etc.).
Knowledge of APIs, REST services, and real-time data processing.
Exposure to CI/CD practices for data pipelines.