Data Warehouse Engineer
Remote (Minnetonka, MN)
Phone + Skype
Job description:
Top Skills
Highest Priority / Must Have
ADLS Gen2 with Apache Iceberg
dbt on Snowflake
Python / PySpark (Snowflake / Microsoft Fabric-aligned)
Snowflake-based data warehousing and analytics engineering
Additional Required Experience
Data Warehouse / ETL development using Informatica
Data analysis experience (not limited to ETL development)
Strong data quality focus and metrics-driven mindset
Strong SQL skills and cloud data engineering experience (Azure preferred)
Supporting / Preferred Tools
Kafka
GoldenGate
IDMC
Azure Data Factory
Airflow
Metadata-driven pipeline management
Environment / Tech Stack
Informatica
Snowflake
Azure (ADLS Gen2, Azure Data Factory, Microsoft Fabric)
Kafka, GoldenGate
dbt (Snowflake), IDMC
PySpark, Airflow
Metadata-driven pipeline management tools
Preferred Background
5+ years of experience in software development and database applications, focused on data strategy, modeling, integration, and architecture
Deep experience designing and building data warehouses, data marts, and ODS solutions
Strong SQL, query optimization, and RDBMS design (3NF and dimensional models)
End-to-end Snowflake implementation experience (RBAC, performance tuning, cloning, optimization)
Healthcare payer data experience (member, enrollment, claims, provider) preferred
Cloud platform experience with Azure and/or AWS
Nice to Have
CDMP or CBIP certification
Experience with ERwin, ER/Studio, or PowerDesigner
ETL tooling: Informatica preferred (DataStage or SSIS acceptable)
Agile and/or ITIL certification
(“Believe you can and you’re halfway there.”)
– Theodore Roosevelt
Yogesh Sharma | Lead Tech Recruiter
An E-Verified Company