Overview
Accepts corp-to-corp applications
Contract - Independent
Contract - W2
Contract - 12
Skills
Data Vault
Job Details
Data Lead (Data Vault 2.0)
Remote
As a Sr. Data Engineer, you will be responsible for the development and validation of mostly structured data, with a focus on the enhancement and ongoing support of a complex data platform built upon a Data Vault 2.0-modeled data warehouse. You will conduct analysis, produce data mapping documents, develop new ELT pipelines, build sophisticated data transformations, ensure data quality, and write usage guidelines for the Growth Strategy & Operations BIGW application. You will also partner with members of our extended team to understand the business processes underpinning BIGW data representation.
Primary Responsibilities:
This internal-facing position will work with other developers, data architects, and business partners to help build and support the Next Generation of the BIGW solution. Built on Snowflake, it is designed to provide the performance and agility demanded by an ever-changing business.
Requires the ability to work independently in a collaborative environment. As a data warehouse engineer, your role includes data analysis, data modeling, data validation, system configuration, and technical writing.
Work in a matrix environment by partnering with other teams in the organization to deliver data and reporting needs to support the expansion of the platform through new business and migration.
Work in an Agile framework within a matrix environment, working in sprints and utilizing agile tools (e.g., Rally).
Build, maintain and/or adhere to a structured data governance process to be used across all datasets with a focus on quality and accuracy.
Identify and drive the resolution of data integrity issues and organizational challenges.
Provide operational support as needed to ensure data availability SLAs are met.
Functional Competencies:
Requires a solid understanding of Data Vault 2.0 concepts and practices.
Ability to develop complex ELT transformations in dbt Cloud (an illustrative sketch follows this list).
A proven record of delivering solutions in a multi-environment platform and managing development lifecycle demands, including code versioning, content promotion, CI/CD, and job orchestration.
Manage and protect data, adhering to applicable legal/regulatory requirements (e.g., HIPAA, PHI, PII, DOI, state and federal regulations).
Thoroughly validate all your work and be able to design complex testing scenarios.
Manage/monitor ETL/ELT jobs associated with both batch and near-real-time processes. This will require problem solving and on-call support as necessary.
Be able to monitor and make configuration recommendations to balance Snowflake's performance/cost model.
Create and/or update user documents and materials such as Visio ER diagrams and technical abstracts.
Demonstrate understanding of the difference between business requirements and technical solutions.
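For illustration only, the sketch below shows the kind of dbt model work these competencies describe: an incremental dbt model loading a Data Vault 2.0 hub on Snowflake. The source, table, and column names (raw_members, member_id) and the MD5 hash-key approach are assumptions made for the example, not details of the actual BIGW platform.

    -- hub_member.sql: assumed dbt model loading a Data Vault 2.0 hub (illustrative)
    {{ config(materialized='incremental', unique_key='member_hk') }}

    with staged as (
        select
            md5(upper(trim(member_id)))  as member_hk,      -- hub hash key
            member_id                    as member_bk,      -- business key
            current_timestamp()          as load_dts,       -- load timestamp
            'raw_members'                as record_source   -- source system tag
        from {{ source('raw', 'raw_members') }}
    )

    select distinct member_hk, member_bk, load_dts, record_source
    from staged
    {% if is_incremental() %}
      -- insert only business keys not already present in the hub
      where member_hk not in (select member_hk from {{ this }})
    {% endif %}

In practice, the hash-key generation and incremental logic would typically come from shared macros or a Data Vault package rather than being hand-written in each model.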
Required Qualifications:
4+ years of experience with Data Modeling, ELT/ETL construction with advanced job scheduling using dbt Cloud
8+ years of experience with relational databases, database structures and design, systems design, data management, data warehouse (including Data Vault 2.0)
5+ years of in-depth experience with Snowflake
8+ years of SQL/TSQL Development experience
8+ years of experience performing significant data analysis
2+ years of experience in an Azure environment
2+ years of experience with Python
2+ years of experience with Git
Expert level of programming and troubleshooting knowledge
Advanced proficiency in Microsoft Excel, Word, and Visio
Preferred Qualifications:
Experience with Snowpark
Experience administering and supporting an Azure environment (network, firewall, PrivateLink, subscriptions, etc.)
Experience administering and supporting an Azure Databricks workspace with Unity Catalog connections
Experience with Azure Data Factory
Experience with SSIS