Data Engineer with Microsoft Fabric Experience
Remote role
Contract

Job Description

Requirements Gathering & Solution Design
Engage with business stakeholders to understand analytical, operational, and compliance needs
Convert business requirements into functional designs, source-to-target mappings, transformation logic, and technical specifications
Validate requirements against enterprise data models and recommend architecture patterns (Lakehouse, Warehouse, Real-Time Hub)
Data Modeling & Fabric Semantic Layer
Design, build, and govern Fabric semantic models (Direct Lake, Import, DirectQuery, and Hybrid modes)
Define enterprise-wide canonical models, shared dimensions, hierarchies, KPIs, and reusable DAX measures
Optimize semantic models for performance using aggregations, incremental refresh, and partitioning strategies
Enable certified datasets, semantic governance, and row-level security within Fabric
ETL/ELT Engineering with Fabric Pipelines, Dataflows, and Notebooks
Build ingestion and transformation processes using Data Factory pipelines, Dataflows Gen2, Warehouse pipelines, and PySpark notebooks
Maintain metadata-driven ETL patterns and reusable frameworks for ingestion, harmonization, and transformation
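To illustrate the "metadata-driven ETL pattern" responsibility above, here is a minimal pure-Python sketch of how a config table can drive source-to-target loads. The config schema, table names, and statement shapes are illustrative assumptions, not a prescribed standard; in a Fabric notebook each generated statement would typically be executed via spark.sql().

```python
# Minimal sketch of a metadata-driven ingestion pattern (illustrative only).
# The config schema and table names below are assumptions; in a real Fabric
# notebook each generated statement would be run with spark.sql(stmt).

INGESTION_CONFIG = [
    # Each entry drives one source-to-target load.
    {"source": "sales.orders_raw", "target": "lh_bronze.orders", "mode": "append"},
    {"source": "sales.customers_raw", "target": "lh_bronze.customers", "mode": "overwrite"},
]

def build_load_statement(entry: dict) -> str:
    """Render one config entry into a Spark SQL load statement."""
    verb = "INSERT INTO" if entry["mode"] == "append" else "INSERT OVERWRITE"
    return f"{verb} {entry['target']} SELECT * FROM {entry['source']}"

def build_all(config: list) -> list:
    """Render every configured load; adding a source is a config change, not a code change."""
    return [build_load_statement(e) for e in config]

if __name__ == "__main__":
    for stmt in build_all(INGESTION_CONFIG):
        print(stmt)
```

The point of the pattern is that onboarding a new source becomes a one-row config addition rather than a new pipeline.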
Fabric Notebook Engineering
Use Fabric Notebooks to perform:
- PySpark transformations
- Delta Lake optimization (Z-ordering, vacuuming, partitioning)
- Data validation and data quality checks
- ML feature engineering and lightweight model operations
Automate notebook execution via pipelines, triggers, and Fabric scheduling
Integrate notebooks with Lakehouse tables, Warehouse tables, and ML model outputs
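As a concrete sketch of the Delta Lake optimization tasks listed above, the snippet below emits the OPTIMIZE (with Z-order) and VACUUM statements a scheduled maintenance notebook might run. Table names, Z-order columns, and the retention period are assumptions for illustration; in the notebook each statement would be executed with spark.sql().

```python
# Sketch of Delta Lake maintenance a scheduled Fabric notebook might run.
# Table/column names and the 168-hour retention are illustrative assumptions;
# each statement would be executed with spark.sql(stmt) in the notebook.

def delta_maintenance_statements(table, zorder_cols, retention_hours=168):
    """Emit OPTIMIZE (with Z-order) and VACUUM statements for one Delta table."""
    return [
        f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})",
        f"VACUUM {table} RETAIN {retention_hours} HOURS",
    ]

if __name__ == "__main__":
    for stmt in delta_maintenance_statements("lh_silver.orders", ["order_date", "customer_id"]):
        print(stmt)
```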
Near Real-Time (NRT) Data Processing in Fabric
Design and implement near-real-time data ingestion pipelines using:
- Fabric Real-Time Hub
- Event Streams
- KQL Databases
- Streaming Dataflows
Build streaming transformations and real-time analytical models leveraging Kusto Query Language (KQL) and PySpark Structured Streaming
Ensure low-latency ingestion into the Lakehouse/Warehouse for downstream consumption
Optimize real-time workloads for durability, recovery, and performance under high throughput
Build dashboards and semantic models that support near-real-time refresh scenarios
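The streaming transformations above typically reduce to windowed aggregations. The plain-Python sketch below illustrates the tumbling-window count that a PySpark Structured Streaming job (or a KQL summarize-by-bin query) would apply to an event stream; the event shape and 60-second window are assumptions for illustration.

```python
# Plain-Python illustration of a tumbling-window aggregation, the core of
# most streaming transformations. Event shape (epoch_seconds, key) and the
# 60-second window size are illustrative assumptions.
from collections import Counter

def tumbling_window_counts(events, window_seconds=60):
    """Group (epoch_seconds, key) events into fixed windows and count per key."""
    windows = {}
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        windows.setdefault(window_start, Counter())[key] += 1
    return windows

if __name__ == "__main__":
    events = [(0, "click"), (10, "view"), (59, "click"), (61, "click")]
    # Events at t=0, 10, 59 fall in the [0, 60) window; t=61 starts [60, 120).
    print(tumbling_window_counts(events))
```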
Performance Optimization
Optimize SQL queries, Lakehouse Delta tables, semantic models, DAX expressions, and Power BI datasets
Review and tune pipeline throughput, notebook execution performance, and refresh schedules
Improve Direct Lake performance by optimizing storage layouts, file-size distributions, and column selection
Perform workload monitoring using Fabric capacity metrics and logs
Power BI Development & Visualization
Build high-quality Power BI dashboards and enterprise reports integrated with centralized semantic models
Develop DAX calculations, KPIs, UX/UI standards, drill-throughs, and row-level security
Drive semantic model reuse and promote governed gold datasets
Governance, Security, Compliance & Purview Integration
Implement data governance and cataloging using Microsoft Purview for Fabric assets
Manage lineage tracking, business glossaries, data classification, and metadata enrichment
Define enterprise security controls: RBAC, data masking, PII handling, encryption, and retention policies
Ensure compliance with GDPR, CCPA, HIPAA, SOX, and internal audit controls
Govern Fabric workspace structure, capacity usage, data certification processes, and lifecycle management
Technical Leadership & Program Delivery
Lead data engineers, BI developers, and analysts across multiple initiatives
Review designs, source-to-target mappings (STTMs), code, semantic models, and performance benchmarks
Own sprint planning, estimation, milestone tracking, and stakeholder communication
Promote documentation, technical standards, reusable design frameworks, and automation
Required Skills & Experience
Technical Expertise
8+ years of data engineering or BI experience, including 2+ years with Microsoft Fabric
Expert in data warehouse design, dimensional modeling, semantic modeling, and data governance
Strong hands-on experience with:
- Fabric Lakehouses and Warehouses
- Fabric semantic models (Direct Lake, Import, Hybrid)
- Real-Time Hub, Event Streams, and KQL Databases
- Notebooks (PySpark data transformations and optimization)
- Data Factory pipelines and Dataflows Gen2
- Power BI modeling, DAX, and report development
Solid understanding of Delta Lake, Spark performance tuning, and workload optimization
Expertise in implementing source-to-target mappings, transformation logic, and validation rules
Preferred Skills
Experience with Azure Synapse, ADF, ADLS, and Databricks
Knowledge of DataOps: CI/CD, Git, DevOps pipelines, and unit testing
Familiarity with Fabric AI (Copilot for Power BI) and AI-driven engineering accelerators
Soft Skills
Excellent communication, articulation, and stakeholder management
Strong leadership skills and the ability to mentor others
Problem-solving mindset with a focus on scalability, efficiency, and accuracy
Skills
Mandatory Skills: Microsoft Fabric - Warehousing, Microsoft Fabric - Data Engineering