Job Details
Title: Senior Snowflake Data Engineer
Location: Charlotte, NC (Hybrid)
Contract Type: C2C (Corp-to-Corp)
Job Description:
Job Summary
The Senior Snowflake Data Engineer will design, build, and optimize highly scalable, robust data pipelines on the Snowflake cloud data platform. The role requires expertise in modern data warehousing principles, advanced SQL, and ETL/ELT methodologies, along with proficiency in a programming language such as Python or Scala, to deliver high-quality, reliable data solutions for business intelligence, analytics, and machine learning initiatives.
Key Responsibilities
1. Data Pipeline Development & ETL/ELT
Design & Build: Lead the end-to-end design, development, and deployment of secure, optimized, and reliable ETL/ELT data pipelines that ingest, transform, and load large volumes of complex data into Snowflake.
Transformation Logic: Implement advanced data transformation logic using Snowflake SQL, Stored Procedures, UDFs, and tools like dbt (data build tool) for managing and deploying data models.
Integration: Integrate a variety of data sources (e.g., streaming data, APIs, relational databases, and cloud storage such as S3 or ADLS) using Snowflake features like Snowpipe and external stages (see the sketch after this list).
Performance: Optimize data ingestion and transformation jobs for speed and cost efficiency by leveraging Snowflake features such as Virtual Warehouse sizing, clustering keys, and query optimization.
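For illustration, a minimal sketch of this kind of ingestion-and-transform pipeline in Snowflake SQL. All object names (stage, pipe, tables, bucket path) are hypothetical, and a real external stage would also need credentials or a storage integration:

```sql
-- External stage over cloud storage; a storage integration or credentials
-- would be required in practice and are omitted here.
CREATE STAGE IF NOT EXISTS raw.orders_stage
  URL = 's3://example-bucket/orders/'   -- hypothetical bucket path
  FILE_FORMAT = (TYPE = 'JSON');

-- Landing table with a single VARIANT column for semi-structured data.
CREATE TABLE IF NOT EXISTS raw.orders_landing (payload VARIANT);

-- Snowpipe for continuous ingestion; AUTO_INGEST assumes cloud event
-- notifications are configured on the bucket.
CREATE PIPE IF NOT EXISTS raw.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders_landing
  FROM @raw.orders_stage;

-- Downstream ELT step: cast the semi-structured payload into a typed table.
INSERT INTO curated.orders
SELECT payload:order_id::NUMBER,
       payload:customer_id::NUMBER,
       payload:order_ts::TIMESTAMP_NTZ,
       payload:amount::NUMBER(12, 2)
FROM raw.orders_landing;
```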
2. Data Modeling & Architecture
Modeling: Design and implement robust data models (e.g., Kimball, Inmon, Data Vault) within Snowflake to support diverse analytical and reporting needs (a small example follows this section).
Best Practices: Establish and enforce data quality, governance, and coding standards for all data engineering artifacts.
Platform Enhancement: Collaborate with Data Architects and Snowflake Administrators to evolve the core data platform architecture and introduce new features or capabilities.
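As a small illustration of the Kimball-style dimensional modeling referenced above, a minimal star-schema sketch with hypothetical table and column names:

```sql
-- Dimension with a surrogate key and SCD Type 2 validity columns.
CREATE TABLE IF NOT EXISTS marts.dim_customer (
    customer_sk   NUMBER AUTOINCREMENT,  -- surrogate key
    customer_id   NUMBER NOT NULL,       -- natural/business key
    customer_name VARCHAR,
    region        VARCHAR,
    valid_from    TIMESTAMP_NTZ,
    valid_to      TIMESTAMP_NTZ
);

-- Fact table keyed to the dimension; clustering on the common filter column
-- helps Snowflake prune micro-partitions on large scans.
CREATE TABLE IF NOT EXISTS marts.fct_orders (
    order_id    NUMBER NOT NULL,
    customer_sk NUMBER NOT NULL,  -- references dim_customer.customer_sk
    order_date  DATE   NOT NULL,
    amount      NUMBER(12, 2)
)
CLUSTER BY (order_date);
```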
3. Automation & Operational Excellence
CI/CD & DevOps: Implement Infrastructure as Code (IaC) using tools such as Terraform, and integrate data pipeline deployments into CI/CD workflows for automation and version control.
Monitoring: Develop and maintain monitoring, logging, and alerting for data pipelines to ensure data quality, integrity, and operational health (an example health check follows this list).
Troubleshooting: Diagnose and resolve complex data pipeline failures, performance bottlenecks, and data quality issues in a timely manner.
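A minimal example of the kind of pipeline health check that can feed alerting, using Snowflake's COPY_HISTORY table function; the monitored table name is hypothetical:

```sql
-- Files that failed (or only partially loaded) in the last 24 hours; any
-- result here could drive an alert from an orchestrator or monitoring job.
SELECT file_name, last_load_time, status, first_error_message
FROM TABLE(information_schema.copy_history(
       table_name => 'RAW.ORDERS_LANDING',
       start_time => DATEADD(hour, -24, CURRENT_TIMESTAMP())))
WHERE status != 'Loaded'
ORDER BY last_load_time DESC;
```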
Minimum Qualifications
- 5+ years of professional experience in data engineering, ETL development, or similar roles.
- 3+ years of hands-on experience designing and building data solutions specifically on the Snowflake Cloud Data Platform.
- Expert-level proficiency in advanced SQL and experience with dimensional and relational data modeling.
- Proficiency in a programming language suitable for data engineering, such as Python (including libraries like pandas or PySpark).
- Experience with cloud data technologies and services (AWS, Azure, or Google Cloud Platform).
- Strong understanding of data warehousing concepts and ELT/ETL principles.
Preferred Skills & Certifications
- Snowflake SnowPro Core or an Advanced certification (e.g., SnowPro Advanced: Data Engineer).
- Hands-on experience with dbt (data build tool) for data transformation/modeling.
- Experience with workflow orchestration tools (e.g., Apache Airflow, Azure Data Factory, or similar).
- Knowledge of big data technologies (e.g., Spark, Hadoop).
- Experience implementing Data Governance and Role-Based Access Control (RBAC) within Snowflake (a minimal sketch follows below).
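A minimal sketch of the Snowflake RBAC pattern referenced above; role and object names are hypothetical:

```sql
-- Functional role granted read access to a curated schema.
CREATE ROLE IF NOT EXISTS analyst_role;

GRANT USAGE ON DATABASE analytics                         TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.curated                   TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.curated    TO ROLE analyst_role;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.curated TO ROLE analyst_role;

-- Roll the functional role up to SYSADMIN so ownership stays centralized.
GRANT ROLE analyst_role TO ROLE SYSADMIN;
```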