Job Description: Snowflake Architect
Role Overview
The Data Analytics and Insights team is seeking a Snowflake Architect to lead the design and implementation of scalable, secure, and high-performing data platforms on Snowflake. This role is responsible for defining the data architecture, standards, and best practices that enable enterprise analytics, reporting, and advanced data use cases.
The ideal candidate will bring deep expertise in Snowflake, cloud data architecture, and modern data engineering practices, along with the ability to guide teams and stakeholders in building a robust and future-ready data ecosystem.
Key Responsibilities
Architecture & Platform Design
- Design and implement enterprise-scale data architecture on Snowflake.
- Define best practices for:
  - data modeling (dimensional, Data Vault, etc.)
  - data partitioning and clustering
  - performance optimization and cost management
- Architect end-to-end data flows from ingestion to consumption.
Data Engineering Standards & Governance
- Establish and enforce data governance, security, and compliance frameworks.
- Define standards for:
  - data quality
  - metadata management
  - lineage and auditing
- Ensure secure data access and role-based controls within Snowflake.
Data Pipeline & Integration Design
- Architect scalable ETL/ELT pipelines using Snowflake and tools like Informatica, dbt, or custom frameworks.
- Design integrations with:
  - upstream source systems
  - downstream analytics and reporting platforms
- Enable real-time and batch data processing where applicable.
Performance & Optimization
- Optimize Snowflake workloads for:
  - performance
  - scalability
  - cost efficiency
- Implement strategies for:
  - workload management
  - query tuning
  - storage optimization
Stakeholder Collaboration
- Partner with business, analytics, and engineering teams to translate requirements into scalable architecture solutions.
- Work closely with enterprise architecture teams to ensure alignment with broader IT strategy.
Leadership & Mentorship
- Provide technical leadership and guidance to data engineers and developers.
- Review designs and ensure adherence to architectural standards.
- Drive adoption of modern data platform best practices across teams.
Minimum Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 8–12+ years of experience in data engineering, data architecture, or related roles.
- Deep expertise in Snowflake architecture and implementation.
- Strong experience with GitHub, Jenkins, and cloud platforms (AWS, Azure, or Google Cloud Platform).
- Expertise in data warehousing concepts and modeling techniques.
- Hands-on experience with ETL/ELT tools (Informatica, dbt, etc.).
- Strong proficiency in SQL and performance tuning.
- Experience designing scalable, enterprise-grade data solutions.
Preferred Qualifications
- Experience with Python or other scripting languages.
- Familiarity with big data technologies (Spark, Kafka, etc.).
- Experience with data orchestration tools (Airflow, etc.).
- Exposure to BI tools (Tableau, Power BI).
- Experience in utilities or energy sector (e.g., Smart Meter, Grid data) is a plus.
Key Outcomes / Success Measures
- Scalable and performant Snowflake data platform
- High data quality and governance maturity
- Optimized cost and query performance
- Strong alignment between business needs and the data platform architecture