Overview
Remote
$70 - $80
Contract - W2
Contract - 6 Month(s)
Skills
Snowflake
DBT
ETL
Epic
Clinical Data
Azure Data Factory
SQL
Snowpipe
Job Details
Position: Principal Data Engineer
Remote
6 months + extensions.
Project: Clinical Data Analytics Team Overview
- The only team responsible for end-to-end development within the organization.
- Builds data products for the Chief Medical Officer (CMO), with a strong focus on delivering actionable insights.
- Develops Tableau dashboards on top of these data products to enable visualization and reporting.
- Operates through two development pods, each working on separate data models.
- Currently addressing significant technical debt by consolidating data into an enterprise data zone for better leverage.
- Goal is to clean and standardize data to support downstream systems and create a scalable, enterprise-ready solution.
Tech Stack:
- DBT
- Snowflake
- Git
- Epic - Clinical data covering the patient encounter from hospital admission through discharge (not Epic claims data)
- Clarity (must have)
- Caboodle (must have)
- Data profiling, extending somewhat into the analyst role
- Nice to have:
- CI/CD pipeline experience
- Healthcare and/or clinical data
- Epic
- Azure DevOps
- Data Modeling
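As a rough illustration of how this stack fits together, the sketch below shows a dbt staging model over an Epic Clarity extract landed in Snowflake. This is a hypothetical example, not part of the role's codebase; the source and column names assume a standard Clarity hospital-encounter table.

```sql
-- models/staging/stg_hospital_encounters.sql
-- Hypothetical dbt staging model: standardizes admission/discharge
-- timestamps from a Clarity extract for downstream data products.
select
    pat_enc_csn_id   as encounter_id,
    hosp_admsn_time  as admitted_at,
    hosp_disch_time  as discharged_at
from {{ source('clarity', 'pat_enc_hsp') }}
where hosp_admsn_time is not null
```

In a typical setup, dbt compiles the `{{ source(...) }}` reference against a sources YAML entry pointing at the Snowflake database and schema where Snowpipe or Azure Data Factory lands the raw Clarity data.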
Principal Duties and Responsibilities
- Perform EDW ETL/ELT ingestions and integrations to support data and analytics needs.
- Build and enhance standard solutions that provide efficient and scalable ETL/ELT from multiple data sources.
- Ensure the quality of data assets and robustness of data engineering processes.
- Apply change control, release management, and other ITIL processes.
- Participate in building out EDW on Snowflake, expanding and optimizing the data ecosystem, as well as optimizing data engineering processes.
- Support BI Developers, Data Architects, Data Analysts, and Data Scientists on data initiatives, ensuring a consistent, optimal data delivery architecture across ongoing projects.
- Conduct ETL solution, code, and pre-production reviews.
- Perform troubleshooting on ETLs and related components.
- Identify application bottlenecks and opportunities to optimize performance.
Qualifications:
- 7+ years of experience designing and building data ingestion and data integration solutions for enterprise data and analytics platforms
- 3+ years of experience developing data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Streams)
- 2+ years of experience with developing models in DBT
- Demonstrated knowledge of Scrum and Agile principles
- Experience with Azure DevOps, Azure Git and CI/CD data pipeline integrations.
- Demonstrated knowledge of cloud computing platforms such as Azure
- Experience supporting and enhancing an existing Enterprise Data Warehouse or Data Lake
- Ability to design systems that are highly reliable, self-recovering, and require minimal manual support
- Familiarity with change control, release management, and other ITIL methodologies
- Deep experience in both traditional and cloud data warehousing
- Mastery of SQL, especially within cloud-based data warehouses like Snowflake
- Experience with logical and physical data modeling
- Healthcare experience, most notably with clinical data, Epic, payer data, and reference data, is a plus but not mandatory.
- Ability to clearly and concisely communicate complex technical concepts to both technical and non-technical audiences
- Proven verbal communication and presentation skills
- Proven ability to work independently.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.