Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Skills
Microsoft Fabric
Data Engineer
Apache Spark
Data Quality
Agile
Job Details
Role Summary:
Builds, maintains, and optimizes data ingestion pipelines, Lakehouse tables, transformation logic, data quality checks, and operational workflows within Microsoft Fabric.
Key Responsibilities:
- Architect and maintain medallion-architecture (Bronze/Silver/Gold) data frameworks.
- Build ingestion pipelines using Data Factory, Spark Notebooks, Dataflows Gen2, Event Streams, and Lakehouse connectors.
- Develop Silver/Gold transformations including cleansing, enrichment, merging, and standardization.
- Implement data quality checks, validation rules, auditing columns, and error-handling patterns.
- Optimize Lakehouse and Delta Lake operations (schema evolution, partitioning, Z-Order, caching, vacuum, compaction).
- Support production operations including monitoring, incident triage, defect resolution, and performance tuning.
- Create and maintain mapping sheets, schema documentation, lineage diagrams, and runbooks.
- Coordinate with analytics teams to prepare model-ready Gold datasets.
- Participate in Agile ceremonies, design reviews, and technical planning.
- Support UAT, test data preparation, and validation activities.
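To illustrate the kind of data quality work listed above (validation rules, auditing columns, and error-handling patterns), the following is a minimal Python sketch. All names (`validate_row`, `process_batch`, the rule set, and the audit column `_validated_at`) are hypothetical examples, not part of this role's actual codebase or the Microsoft Fabric APIs; in practice this logic would typically live in a Spark Notebook or Dataflow.

```python
from datetime import datetime, timezone

def validate_row(row):
    """Apply simple validation rules; collect errors rather than raising.

    The rules here are illustrative placeholders for whatever business
    rules a real pipeline would enforce.
    """
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    if row.get("amount") is not None and row["amount"] < 0:
        errors.append("negative amount")
    return errors

def process_batch(rows):
    """Split a batch into clean rows (stamped with an audit column)
    and rejected rows (carrying their error list) so bad records are
    quarantined instead of failing the whole load."""
    clean, rejects = [], []
    now = datetime.now(timezone.utc).isoformat()
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejects.append({**row, "_errors": errors})
        else:
            clean.append({**row, "_validated_at": now})
    return clean, rejects
```

The split-and-quarantine pattern keeps the pipeline moving on partial failures while preserving rejected records for later triage, which matches the incident-triage and defect-resolution duties above.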
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.