Senior Microsoft Fabric Data Engineer

Hybrid in Dallas, TX, US • Posted 8 days ago • Updated 8 days ago
Contract: Corp-to-Corp, W2, or Independent
On-site

Job Details

Skills

  • Python
  • SQL
  • PowerShell

Summary

Position: Senior Microsoft Fabric Data Engineer

Location: Dallas, TX (Hybrid/Remote)

Duration: 6+ Months

Core Engineering
  • Advanced SQL (T-SQL) for large-scale data processing, partitioning, indexing, query tuning, and relational systems
  • Data Modeling: 3NF, Star/Snowflake, Data Vault, conformed dimensions, and SCD strategies
  • ETL/ELT development using Fabric Data Pipelines, Dataflows Gen2, and/or Azure Data Factory
  • Lakehouse & Warehouse development in Microsoft Fabric (OneLake, Delta/Parquet, shortcuts)
  • PowerShell, Python, and/or .NET (C#) for data processing, custom connectors, utilities, and SDK usage
  • Power Query (M)/DAX for robust ingestion, schema drift handling, and reusable transformations
Automation & DevOps
  • CI/CD for BI & Data (Fabric pipelines, Power BI deployment pipelines, Azure DevOps or GitHub Actions)
  • Infrastructure as Code (IaC) using Bicep/ARM/Terraform for Azure resources
  • Advanced scripting automations and automated testing for data pipelines (transform validations, data contract checks)
  • Version control & release management (Git, branching strategies, semantic versioning)
  • Job orchestration: scheduling, dependencies, retries, alerts, SLAs/SLOs (Ansible)
Quality, Governance & Reliability
  • Data quality frameworks: validation rules, anomaly detection, reconciliation checks
  • Observability: logging, lineage, metrics, cost/usage monitoring
  • Data governance: metadata management, cataloging, sensitivity labels, PII handling, lifecycle policies
  • Security-first mindset: least privilege, Key Vault, Private Endpoints, RLS/OLS awareness
Database, Reporting & Integration
  • Microsoft Fabric ecosystem experience (Lakehouse, Warehouse, Pipelines, OneLake)
  • API integration expertise: REST/GraphQL APIs, pagination, throttling, authentication flows
  • Experience with data ingestion patterns and integrating diverse data sources (SQL, SaaS, files, SharePoint, APIs)
Expert Power BI Development (if BI responsibilities are shared or overlapping)
  • Advanced dashboard/report development with modern, executive-ready UX
  • Dynamic visuals: bookmarks, drill-through, field parameters
  • RLS/OLS implementation and optimization
Analytics & Data Science Adjacent Skills
  • Statistical analysis & business analytics foundations
  • Ability to translate ambiguous business questions into measurable KPIs
  • Understanding of data quality, lineage, and governance best practices
  • Ability to design KPI frameworks and executive-level metrics
Agile & Emerging Technologies
  • Comfortable working in Agile delivery environments
  • Exposure to generative AI and prompt engineering (a plus)
Nice-to-Have
  • Exposure to MLOps or predictive modeling
Soft Skills
  • Communication:
    • Able to explain technical concepts to non-technical stakeholders.
    • Strong collaboration across cross-functional teams.
  • Project Management:
    • Skilled in managing data-related projects and prioritizing tasks.
    • Detail-oriented with a focus on data accuracy and validation.
  • Teamwork & UX Awareness:
    • Effective in working with data engineers, analysts, and business users.
    • Understanding of user experience and UX measurement is a plus.
Analytical & Personal Attributes
  • Strong analytical and problem-solving skills.
  • Self-driven, proactive, and results-oriented.
  • Reliable, structured, and organized.
  • Curious, service-minded, and adaptable.
  • Comfortable working under pressure and in diverse environments.
Responsibilities within the Role:
  • Data engineering
  • Data integration from different sources
  • Analyze backend reporting systems and translate findings into technical terms.
  • Develop automation pipelines
  • Develop and maintain Power BI dashboards as work progresses.
  • Ensure transparency of work status and progress.
  • Collaborate with the Data Warehouse (DWH) team to industrialize AUXi data and reporting solutions.
  • Produce and maintain technical documentation throughout all project phases.
  • Actively contribute ideas, share knowledge, and mentor others within the team and adjacent ones.
  • Follow AUXi's way of working, including acceptance criteria and Definition of Done (DoD).
  • Participate in all Scrum ceremonies and be available during agreed working hours.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10517743
  • Position Id: 2026-8663