Denver, Colorado
Experience Required: 6+ Years

Must Have Technical/Functional Skills:
- Design, build, and orchestrate ETL/ELT pipelines using Azure Databricks.
- Implement batch data ingestion and transformations using PySpark and Spark SQL.
- Architect and maintain Lakehouse and analytical warehouse models (fact and dimension schemas) leveraging Delta Lake.
- Ensure data quality, reliability, lineage, and governance across the data platform.
- Collaborate with security and platform teams to enforce data access controls.
Full-time
Depends on Experience