Position: Senior Microsoft Fabric Data Engineer
Location: Dallas, TX (Hybrid/Remote)
Duration: 6+ Months
Technical Skills
Core Engineering
- Advanced SQL (T-SQL) for large-scale data processing, partitioning, indexing, query tuning, and relational systems
- Data Modeling: 3NF, Star/Snowflake, Data Vault, conformed dimensions, and SCD strategies
- ETL/ELT development using Fabric Data Pipelines, Dataflows Gen2, and/or Azure Data Factory
- Lakehouse & Warehouse development in Microsoft Fabric (OneLake, Delta/Parquet, shortcuts)
- PowerShell, Python, and/or .NET (C#) for data processing, custom connectors, utilities, and SDK usage
- Power Query (M)/DAX for robust ingestion, schema drift handling, and reusable transformations
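As a minimal illustration of the schema-drift handling named above, the sketch below aligns incoming records to an expected schema, tolerating added or missing columns. The schema, column names, and defaults are hypothetical, and the drift policy shown (default missing fields, drop unexpected ones) is just one common choice.

```python
# Sketch of schema-drift handling: project each incoming record onto an
# expected schema instead of failing the load. Schema is hypothetical.
from typing import Any

EXPECTED_SCHEMA = {"order_id": 0, "customer": "", "amount": 0.0}  # name -> default

def conform(record: dict[str, Any]) -> dict[str, Any]:
    """Missing fields get defaults; unexpected fields are dropped."""
    return {col: record.get(col, default) for col, default in EXPECTED_SCHEMA.items()}

rows = [
    {"order_id": 1, "customer": "Acme", "amount": 99.5, "region": "TX"},  # extra column
    {"order_id": 2, "customer": "Globex"},                                # missing column
]
conformed = [conform(r) for r in rows]
```

In a real pipeline the same projection would typically be expressed in Power Query (M) or a Dataflow Gen2 step rather than plain Python, but the policy decision is identical.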
Automation & DevOps
- CI/CD for BI & Data (Fabric pipelines, Power BI deployment pipelines, Azure DevOps or GitHub Actions)
- Infrastructure as Code (IaC) using Bicep/ARM/Terraform for Azure resources
- Advanced scripting automations and automated testing for data pipelines (transform validations, data contract checks)
- Version control & release management (Git, branching strategies, semantic versioning)
- Job orchestration: scheduling, dependencies, retries, alerts, SLAs/SLOs (Ansible)
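The "data contract checks" bullet above can be sketched as a lightweight CI-style test: validate pipeline output rows against a declared contract and report breaches instead of silently loading bad data. The contract and sample rows here are hypothetical.

```python
# Sketch of a data-contract check a CI step might run on pipeline output.
CONTRACT = {
    "order_id": int,
    "amount": float,
}

def violations(rows):
    """Return (row_index, column, reason) tuples for contract breaches."""
    problems = []
    for i, row in enumerate(rows):
        for col, typ in CONTRACT.items():
            if col not in row:
                problems.append((i, col, "missing"))
            elif not isinstance(row[col], typ):
                problems.append((i, col, f"expected {typ.__name__}"))
    return problems

good = [{"order_id": 1, "amount": 10.0}]
bad = [{"order_id": "1", "amount": 10.0}, {"order_id": 2}]
```

A real implementation would usually layer on nullability, ranges, and referential rules, but the shape (declared contract, machine-checkable report) is the same.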
Quality, Governance & Reliability
- Data quality frameworks: validation rules, anomaly detection, reconciliation checks
- Observability: logging, lineage, metrics, cost/usage monitoring
- Data governance: metadata management, cataloging, sensitivity labels, PII handling, lifecycle policies
- Security-first mindset: least privilege, Key Vault, Private Endpoints, RLS/OLS awareness
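As one concrete instance of the reconciliation checks listed above, the sketch below compares row counts and a control total between a source extract and the loaded target. The figures and tolerance are hypothetical.

```python
# Sketch of a source-to-target reconciliation check: row counts plus a
# control total on an amount column, with a configurable tolerance.
def reconcile(source_rows, target_rows, amount_key="amount", tol=0.01):
    """Return a dict of checks: row-count match and control-total drift."""
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "total_drift": round(abs(src_total - tgt_total), 2),
        "within_tolerance": abs(src_total - tgt_total) <= tol,
    }

# Hypothetical case: an aggregation step merged two source rows into one.
checks = reconcile([{"amount": 10.0}, {"amount": 5.0}], [{"amount": 15.0}])
```

Note that the two checks can disagree, as here: the control total reconciles even though row counts do not, which is exactly the signal that distinguishes an intended aggregation from lost data.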
Database, Reporting & Integration
- Microsoft Fabric ecosystem experience (Lakehouse, Warehouse, Pipelines, OneLake)
- API integration expertise: REST/GraphQL APIs, pagination, throttling, authentication flows
- Experience with data ingestion patterns and integrating diverse data sources (SQL, SaaS, files, SharePoint, APIs)
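The pagination and throttling patterns named above can be sketched as cursor-based paging with retry and exponential backoff. `fetch_page` is a stand-in for a real HTTP call (e.g. via `requests`); the response shape (`items`, `next_cursor`) is a hypothetical API contract.

```python
# Sketch of cursor-based pagination with simple retry/backoff.
import time

def paginate(fetch_page, max_retries=3):
    """Yield items across pages until the API returns no next cursor."""
    cursor = None
    while True:
        for attempt in range(max_retries):
            try:
                page = fetch_page(cursor)
                break
            except ConnectionError:
                time.sleep(2 ** attempt)  # exponential backoff on throttling
        else:
            raise RuntimeError("page fetch failed after retries")
        yield from page["items"]
        cursor = page.get("next_cursor")
        if cursor is None:
            return

# Simulated two-page API for demonstration; a real client would issue
# HTTP requests and also honor Retry-After headers on 429 responses.
PAGES = {None: {"items": [1, 2], "next_cursor": "p2"},
         "p2": {"items": [3]}}
items = list(paginate(lambda c: PAGES[c]))
```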
Expert Power BI Development (if BI responsibilities are shared or overlapping)
- Advanced dashboard/report development with modern, executive-ready UX
- Dynamic visuals: bookmarks, drill-through, field parameters
- RLS/OLS implementation and optimization
Analytics & Data Science Adjacent Skills
- Statistical analysis & business analytics foundations
- Ability to translate ambiguous business questions into measurable KPIs
- Understanding of data quality, lineage, and governance best practices
- Ability to design KPI frameworks and executive-level metrics
Agile & Emerging Technologies
- Comfortable working in Agile delivery environments
- Exposure to generative AI and prompt engineering (a plus)
Nice-to-Have
- Exposure to MLOps or predictive modeling
Soft Skills
- Communication:
  - Able to explain technical concepts to non-technical stakeholders.
  - Strong collaboration across cross-functional teams.
- Project Management:
  - Skilled in managing data-related projects and prioritizing tasks.
  - Detail-oriented with a focus on data accuracy and validation.
- Teamwork & UX Awareness:
  - Effective in working with data engineers, analysts, and business users.
  - Understanding of user experience and UX measurement is a plus.
Analytical & Personal Attributes
- Strong analytical and problem-solving skills.
- Self-driven, proactive, and results-oriented.
- Reliable, structured, and organized.
- Curious, service-minded, and adaptable.
- Comfortable working under pressure and in diverse environments.
Responsibilities within the Role:
- Data engineering.
- Data integration from different sources.
- Analyze backend reporting systems and translate findings into technical terms.
- Develop automation pipelines.
- Develop and maintain Power BI dashboards as work progresses.
- Ensure transparency of work status and progress.
- Collaborate with the Data Warehouse (DWH) team to industrialize AUXi data and reporting solutions.
- Produce and maintain technical documentation throughout all project phases.
- Actively contribute ideas, share knowledge, and mentor others within the team and adjacent ones.
- Follow AUXi's way of working, including acceptance criteria and Definition of Done (DoD).
- Participate in all Scrum ceremonies and be available during agreed working hours.