Overview
Skills
Job Details
REMOTE ONLY
OPEN TO CANADIAN PERMANENT RESIDENTS AND CANADIAN CITIZENS ONLY
Snowflake Architect
Calgary
Years of experience needed: 12-15 years
Technical Skills:
Design and implement scalable data architectures using Snowflake.
Develop data models, ETL processes, and data pipelines to ensure efficient data flow and integration.
Optimize data storage, retrieval, and processing to improve performance and reduce costs.
Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
Ensure data quality, consistency, and security across all data platforms.
Stay up-to-date with the latest industry trends and best practices in data architecture and Snowflake technologies.
Competence in Snowflake data engineering components such as Snowpipe, Tasks, UDFs, and Dynamic Tables.
Good knowledge of databases, stored procedures, and optimization of large data volumes.
In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning.
Knowledge of multi-layer data architectures (e.g., bronze, silver, gold).
Experience in streaming / real-time / event-driven data platforms.
Experience with building the infrastructure required for data ingestion and analytics.
Ability to fine-tune queries.
Solid understanding of data normalization and denormalization, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
Familiarity with SQL security techniques such as column-level data encryption and Transparent Data Encryption (TDE).
Create and maintain optimal data pipelines, using the most appropriate features within Snowflake.
Implement data pipelines following industry standards for IaC, CI/CD, and automated testing.
Collaborate with stakeholders to understand data and non-functional requirements.
Assemble large, complex data sets that meet functional as well as non-functional business requirements.
Collaborate with the team on building data models and schema design.
Implement data quality checks and data governance standards.
Build the infrastructure required for optimal extraction, transformation, and loading of data.
Consult with product, engineering, and business stakeholders to understand business problems and identify the best way to deliver a solution, educating stakeholders about options where required.
Prepare high-level ETL mapping specifications.
Test data pipelines and perform bug fixes.
Develop best practices for database design and development activities.
Actively participate in agile and design meetings to drive the technical outcomes.
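For candidates reviewing the Snowflake components named above (Snowpipe, Dynamic Tables), a minimal continuous-ingestion sketch looks like the following. All object names (raw_db, orders_raw, orders_stage, transform_wh) and the single-VARIANT-column layout are illustrative assumptions, not part of this posting:

```sql
-- Hypothetical continuous ingestion: a stage-backed Snowpipe loads
-- JSON files into a bronze landing table as they arrive.
CREATE PIPE raw_db.bronze.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.bronze.orders_raw
  FROM @raw_db.bronze.orders_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- A Dynamic Table keeps a derived table refreshed within a target lag,
-- replacing a hand-rolled Task + Stream pipeline for simple transforms.
CREATE OR REPLACE DYNAMIC TABLE raw_db.silver.orders_latest
  TARGET_LAG = '5 minutes'
  WAREHOUSE = transform_wh
AS
  SELECT v:order_id::NUMBER           AS order_id,
         v:amount::NUMBER(12,2)       AS amount,
         v:loaded_at::TIMESTAMP_NTZ   AS loaded_at
  FROM raw_db.bronze.orders_raw;
```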
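One common de-duplication pattern when promoting data from the bronze to the silver layer of the multi-layer architecture mentioned above is a windowed QUALIFY filter. Table and column names here are hypothetical:

```sql
-- Keep only the most recent record per business key when building
-- the silver layer from raw bronze data (illustrative schema).
CREATE OR REPLACE TABLE silver.orders AS
SELECT *
FROM bronze.orders
QUALIFY ROW_NUMBER() OVER (
          PARTITION BY order_id      -- business key
          ORDER BY loaded_at DESC    -- latest load wins
        ) = 1;
```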
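For the query fine-tuning requirement, Snowflake exposes EXPLAIN plans and clustering keys to improve micro-partition pruning. A brief sketch with assumed table and column names:

```sql
-- Inspect the plan and partition pruning for a slow query.
EXPLAIN
SELECT SUM(amount)
FROM silver.orders
WHERE order_date >= '2024-01-01';

-- Define a clustering key so micro-partitions prune on the common
-- filter column (worthwhile only for large, frequently filtered tables).
ALTER TABLE silver.orders CLUSTER BY (order_date);
```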
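On the column-level security requirement: note that TDE is a SQL Server feature, while Snowflake encrypts all data at rest by default and enforces column-level protection primarily through masking policies. A minimal sketch, with hypothetical role, schema, and table names:

```sql
-- Mask email addresses for every role except a privileged one.
CREATE OR REPLACE MASKING POLICY pii.mask_email AS (val STRING)
  RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE crm.customers
  MODIFY COLUMN email SET MASKING POLICY pii.mask_email;
```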
Certifications Needed:
Any relevant certification would be an added advantage.