ETL / Snowflake Developer with hands-on experience in IBM DataStage, Snowflake Cloud Data Platform, and Business Intelligence (BI) Administration within the Healthcare Payer industry.
This role involves building and maintaining robust data integration pipelines, administering BI environments, and supporting analytics use cases across claims, member, provider, and utilization management domains. The ideal candidate will have a strong background in ETL design, data warehousing, data modeling, and Azure-based data ecosystems, combined with working knowledge of payer data standards (e.g., HIPAA, X12 EDI, HL7).
This role focuses on building scalable data pipelines, integrating multiple healthcare data sources, optimizing Snowflake performance, and ensuring data quality, lineage, and compliance across analytical and operational systems. The developer will collaborate with data architects, analysts, and DevOps engineers to modernize legacy ETL processes and enable advanced analytics and interoperability initiatives.
Responsibilities:
- Design, develop, and maintain ETL processes using IBM InfoSphere DataStage and Snowflake to support enterprise data warehouse (EDW) and analytics solutions.
- Develop data pipelines to extract, transform, and load data from multiple source systems, including claims, eligibility, provider, and clinical data.
- Implement data quality checks, audit controls, and error handling within ETL workflows.
- Optimize DataStage jobs for performance, parallelization, and scalability.
- Transition legacy ETL pipelines to modern Azure Data Factory or Snowflake-native ELT solutions as part of modernization initiatives.
- Design and manage Snowflake schemas, virtual warehouses, and data sharing capabilities.
- Optimize query performance using clustering keys, caching, and warehouse sizing strategies.
- Implement role-based access control (RBAC), masking policies, and data governance for PHI and PII data.
- Develop Time Travel, zero-copy cloning, and data retention strategies for efficient data lifecycle management.
- Leverage Snowpipe and Streams/Tasks for real-time or near-real-time data ingestion and change data capture (CDC).
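To illustrate the kind of row-level data quality checks and audit controls expected in these ETL workflows, here is a minimal Python sketch. The record fields (`claim_id`, `member_id`, `service_date`, `billed_amount`) are hypothetical stand-ins for a claims feed, not an actual source schema.

```python
from dataclasses import dataclass, field

# Hypothetical required fields for an inbound claims feed.
REQUIRED_FIELDS = ("claim_id", "member_id", "service_date")

@dataclass
class AuditResult:
    """Audit counters and rejected-row details for one ETL batch."""
    total: int = 0
    passed: int = 0
    errors: list = field(default_factory=list)

def validate_claims(records):
    """Apply quality checks to each row and collect an audit trail.

    Rows missing a required field, or with a non-positive
    billed_amount, are rejected; clean rows pass through to the load.
    """
    audit = AuditResult()
    clean = []
    for row in records:
        audit.total += 1
        missing = [f for f in REQUIRED_FIELDS if not row.get(f)]
        if missing:
            audit.errors.append((row.get("claim_id"), f"missing: {missing}"))
            continue
        if row.get("billed_amount", 0) <= 0:
            audit.errors.append((row["claim_id"], "non-positive billed_amount"))
            continue
        audit.passed += 1
        clean.append(row)
    return clean, audit
```

In a production pipeline the same pattern would typically run inside a DataStage transformer stage or a Snowflake task, with the audit counts written to a control table rather than held in memory.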
- Administer BI platforms such as Power BI, Tableau, or Cognos, including user access, report deployment, and performance optimization.
- Work with business analysts and data scientists to ensure accurate, timely, and reliable data availability for dashboards and models.
- Create and maintain data dictionaries, lineage documentation, and metadata.
- Monitor ETL and BI workloads, troubleshoot failures, and implement proactive performance tuning.
- Support data governance and data quality initiatives, ensuring consistency across reporting layers.

Technical Skills:
- ETL Tools: Azure Data Factory (ADF), Informatica, SSIS, or Matillion
- Cloud Platforms: Microsoft Azure (Data Lake, Synapse, Functions, Key Vault, Logic Apps)
- Databases: Snowflake, SQL Server, Azure SQL, Oracle (as source systems)
- Languages: Advanced SQL, Python, or Scala for scripting and automation
- Data Modeling: Star schema, Snowflake schema, 3NF, Data Vault
- Integration: REST APIs, FHIR, HL7, EDI (HIPAA X12 837, 835, 270/271, 278)
- Version Control / DevOps: Git, Azure DevOps Pipelines, GitHub Actions
- Monitoring / Logging: Azure Monitor, Snowflake Query History, or custom monitoring frameworks
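A custom monitoring framework of the kind mentioned above often starts as a simple threshold check over query-history data. The sketch below is a hypothetical Python example: the record fields (`query_id`, `warehouse`, `elapsed_ms`) are simplified stand-ins, not the actual column names of Snowflake's query history views.

```python
def flag_slow_queries(history, warehouse_thresholds, default_ms=60_000):
    """Flag queries whose elapsed time exceeds a per-warehouse threshold.

    `history` mimics rows pulled from a query-history source; each row
    needs query_id, warehouse, and elapsed_ms keys (hypothetical names).
    Returns flagged queries sorted by how far over the limit they ran.
    """
    flagged = []
    for q in history:
        limit = warehouse_thresholds.get(q["warehouse"], default_ms)
        if q["elapsed_ms"] > limit:
            flagged.append({
                "query_id": q["query_id"],
                "warehouse": q["warehouse"],
                "over_by_ms": q["elapsed_ms"] - limit,
            })
    return sorted(flagged, key=lambda f: f["over_by_ms"], reverse=True)
```

The worst offenders surface first, which makes the output easy to feed into an alerting channel or a daily tuning review.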