Snowflake Data Architect / Solutions Architect

Overview

Remote
$100,000 - $160,000
Full Time

Skills

Payer
Snowflake
Data Warehouse
Data Architecture

Job Details

Hi All,

At Hexaware Technologies, we are a leading global IT services company dedicated to driving digital transformation and innovation for businesses around the world. Founded in 1990, Hexaware has grown into a trusted global partner for enterprises, offering comprehensive AI-empowered services including IT Consulting, Application Development, Infrastructure and Cloud Management, and Business Process Services.

At Hexaware, we are a community of creative, diverse, and open-minded Hexawarians, creating smiles through the power of great people and technology.

Job Description:

  • Strategic planning and hands-on engineering of Snowflake/Big Data and cloud environments that support our clients' advanced analytics and data science initiatives.
  • Provide support in defining the scope and sizing of work.
  • Work closely with enterprise architects, information security teams, and the data management team to ensure the architected solution meets all customer needs, from both a functionality and an IT solution engineering perspective.
  • Lead the design of all aspects of our data solutions, including creation of artifacts such as diagrams, playbooks, and other technical documents.
  • Translate business requirements into technology solutions.
  • Mentor and guide junior team members to deliver solutions on time.
  • Create architecture blueprints and work with the development team to deliver the vision.

Skills:

  • 10+ years of overall experience with Data Management, Big Data, Data Warehousing, and Analytics.
  • At least 3 to 4 years of experience architecting and implementing data solutions using Snowflake in an AWS environment.
  • In-depth understanding of Snowflake architecture, including SnowSQL, performance tuning, and compute and storage.
  • Expertise in Snowflake: data modeling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
  • Expertise in advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, adaptive compute, query performance tuning, zero-copy cloning, Time Travel, and Cortex AI, and an understanding of how to apply these features.
  • Expertise in deploying Snowflake features such as data sharing.
  • Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe, and with Big Data modeling techniques using Python.
  • Hands-on working experience with languages such as Java, Scala, Python, or shell scripting.
  • In-depth understanding of the various storage services offered by AWS.
  • Experience implementing data security, encryption, PII/PSI legislation compliance, and identity and access management across sources and environments.
  • Experience with data process orchestration and the end-to-end design and build of near-real-time and batch data pipelines.
  • More than one year of multi-cloud experience with at least two of the three major public cloud platforms (AWS, Azure, and Google Cloud Platform).
  • Snowflake certification is preferred.
  • Strong client-facing communication and facilitation skills.
  • Two to three years of prior experience in the healthcare and payer domain is a must.