Seattle, Washington
Today
Job Description:
>> Design and implement data pipelines in Azure Databricks for ingesting and transforming data from upstream systems
>> Optimize ETL/ELT workflows for performance and scalability
>> Collaborate with Java/API developers to integrate event-driven triggers into data pipelines
>> Implement data quality checks, schema validation, and error handling
>> Support batch and near-real-time data flows for operational and analytics use cases
>> Work with Boomi and WFM teams to ensu
Third Party, Contract
Depends on Experience