Job Details
As a Data Engineer, you will design, build, and optimize scalable data pipelines and architecture. Your deep expertise in Snowflake, dbt, and Fivetran will be critical in transforming raw data into actionable insights that empower business decisions.
Key Responsibilities
Develop, maintain, and optimize ETL/ELT pipelines using Fivetran for robust data ingestion from multiple sources.
Design and implement scalable and efficient data models and transformations in Snowflake.
Build data transformation workflows leveraging dbt to ensure data quality, consistency, and documentation across datasets.
Collaborate closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver timely solutions.
Monitor data pipeline performance and troubleshoot issues to ensure reliability and accuracy.
Implement best practices for data governance, security, and compliance within Snowflake environments.
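In practice, the dbt transformation and data-quality responsibilities above often come down to small, well-tested staging models. The sketch below is purely illustrative: the source, table, and column names are invented for this example, and it assumes Snowflake SQL (for `qualify`) plus Fivetran's standard `_fivetran_synced` metadata column.

```sql
-- models/staging/stg_orders.sql (hypothetical model name)
-- Standardizes a raw orders table loaded by Fivetran into a typed,
-- deduplicated staging view, keeping only the most recently synced
-- row per order.
select
    id                              as order_id,
    customer_id,
    cast(amount as numeric(12, 2))  as order_amount,
    cast(created_at as timestamp)   as ordered_at
from {{ source('fivetran_raw', 'orders') }}
qualify row_number() over (
    partition by id
    order by _fivetran_synced desc
) = 1
```

A companion `schema.yml` would typically attach `unique` and `not_null` tests to `order_id`, which is how dbt enforces the data quality and consistency the role calls for.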
Required Skills and Experience
Strong hands-on experience with Snowflake, including performance tuning, clustering, and query optimization.
Proficiency in dbt for data modeling, transformation, and testing within a modern data stack.
Experience designing and managing data pipelines with Fivetran or similar ELT platforms.
Solid SQL skills and understanding of data warehousing concepts.
Familiarity with cloud platforms (AWS, Azure, or Google Cloud Platform) and their integration with Snowflake.
Knowledge of version control (e.g., Git) and CI/CD pipelines for dbt deployments.
Ability to work in an agile, collaborative environment and communicate technical concepts clearly.
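As a concrete, entirely hypothetical illustration of the SQL and warehousing skills listed above, the snippet below runs a window-function rollup against an in-memory SQLite table. The table and column names are invented for the example; in this role the same query shape would target Snowflake rather than SQLite.

```python
import sqlite3

# Hypothetical mini "warehouse": a single fact table queried with
# window functions, the kind of SQL a warehouse role exercises daily.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table fact_orders (
        order_id    integer primary key,
        customer_id text,
        order_date  text,
        amount      real
    );
    insert into fact_orders values
        (1, 'c1', '2024-01-01', 100.0),
        (2, 'c1', '2024-01-03', 50.0),
        (3, 'c2', '2024-01-02', 75.0);
""")

# Running revenue per customer, ordered by date -- a typical
# warehouse reporting query built on a window function.
rows = conn.execute("""
    select customer_id,
           order_date,
           sum(amount) over (
               partition by customer_id
               order by order_date
           ) as running_revenue
    from fact_orders
    order by customer_id, order_date
""").fetchall()

for row in rows:
    print(row)
```

The `qualify`-style deduplication and incremental-model patterns used in Snowflake and dbt build directly on this same window-function foundation.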