Overview
On Site
Depends on Experience
Full Time
Skills
Snowflake
SQL
Amazon Web Services
AWS
Job Details
Job Description
Design, build, and enhance cloud data workflows/pipelines that process billions of records in large-scale data environments, with experience in end-to-end design and development of both near-real-time and batch data pipelines.
Design and build Data Lake and data warehouse solutions on Snowflake and AWS using streaming and batch processes.
Develop test strategies, software testing frameworks, and test automation.
Champion a modern data engineering culture and best-practices-based software development.
Leverage DevSecOps techniques, with working experience in modern tools such as GitLab, Jira, and build automation.
Engage in application design and data modeling discussions; participate in developing and enforcing data security policies.
Drive delivery efficiency with automation and reusable components/solutions.
Minimum of 3 years of experience in the field of data engineering, building data integration solutions or data warehouse environments such as Snowflake, Oracle, etc.
Minimum of one year of working experience with dbt, GitLab, and AWS services such as S3 and the AWS CLI.
Knowledge of RDBMSs, data warehousing, SQL, and ETL.
Hands-on experience with data transformations, cleansing, de-duplication, and ETL batch loads.
Understanding of Snowflake architecture, its role-based access ecosystem, and best practices for data loading and optimization.
Experience with Snowflake features, including data loading/unloading processes, the different Snowflake table structures and types, and stored procedures.
Ability to independently identify and resolve issues across the various platforms involved in the data flow.
Experience in developing data pipelines for both cloud and hybrid-cloud platforms.
Knowledge of Python or another scripting language is a plus.
Experience in an Agile delivery environment.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.