Overview
Skills
Job Details
Overall 10+ years of experience with Data Management, Big Data, ADF, Data Warehousing, and Analytics.
At least 3 to 4 years of experience architecting and implementing data solutions using Snowflake in an AWS environment.
In-depth understanding of Snowflake architecture, including SnowSQL, performance tuning, compute, and storage.
Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, Adaptive Compute, query performance tuning, zero-copy cloning, Time Travel, and Cortex AI, and an understanding of how to apply these features (an illustrative SQL sketch follows this list).
Expertise in deploying Snowflake features such as data sharing.
Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe, and with Big Data modelling techniques using Python (a Snowpipe sketch follows this list).
Hands-on working experience with dbt is a must-have skill (a dbt model example follows this list).
Experience implementing data security, encryption, compliance with PII/PSI legislation, and identity and access management across sources and environments.
Experience with data process orchestration and with the end-to-end design and build of near-real-time and batch data pipelines.
More than one year of multi-cloud experience in at least two of the three public cloud platforms (AWS, Azure, and Google Cloud Platform).
Certification in Snowflake is preferred.
Strong client-facing communication and facilitation skills.
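For illustration only, a minimal Snowflake SQL sketch of some of the advanced features named above (resource monitors, zero-copy cloning, Time Travel); all object names (monthly_cap, reporting_wh, sales_db) are hypothetical and not part of the role description.

    -- Cap monthly credit usage and attach the monitor to a warehouse
    CREATE OR REPLACE RESOURCE MONITOR monthly_cap
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
           TRIGGERS ON 90 PERCENT DO NOTIFY
                    ON 100 PERCENT DO SUSPEND;
    ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_cap;

    -- Zero-copy clone: instant, storage-efficient copy of a database
    CREATE DATABASE sales_db_dev CLONE sales_db;

    -- Time Travel: query a table as it existed one hour ago
    SELECT * FROM sales_db.public.orders AT (OFFSET => -3600);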
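Similarly, a hedged Snowpipe sketch, assuming an external S3 stage and an existing raw_events target table; the bucket path, stage, and table names are hypothetical, and the storage integration/credentials are omitted.

    -- Hypothetical file format and external stage
    CREATE OR REPLACE FILE FORMAT json_fmt TYPE = 'JSON';
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (FORMAT_NAME = 'json_fmt');  -- storage integration omitted

    -- Snowpipe: continuously load new files that land in the stage
    CREATE OR REPLACE PIPE events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events
      FROM @raw_stage
      FILE_FORMAT = (FORMAT_NAME = 'json_fmt');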
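And a minimal dbt model sketch (for example, models/stg_orders.sql), assuming a raw orders source has been declared in the project's sources YAML; the model, source, and column names are illustrative only.

    -- models/stg_orders.sql (illustrative; column names are assumptions)
    {{ config(materialized='view') }}

    select
        order_id,
        customer_id,
        order_date,
        amount
    from {{ source('raw', 'orders') }}
    where order_date is not null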