Microsoft Fabric Developer

  • Minneapolis, MN
  • Posted 1 day ago | Updated 1 day ago

Overview

On Site
$50 - $60
Accepts corp to corp applications
Contract - W2
Contract - 6 Month(s)

Skills

ETL
Microsoft Fabric
Lakehouse
Warehouse
Snowflake

Job Details

Role Description

  • Responsible for understanding requirements and performing data analysis.

  • Responsible for the setup of Microsoft Fabric and its components.

  • Build secure, scalable solutions across the Microsoft Fabric platform.

  • Create and manage Lakehouses.

  • Implement Data Factory processes for data ingestion, scalable ETL, and data integration.

  • Design, implement and manage comprehensive warehousing solutions for analytics using Fabric.

  • Create and schedule data pipelines using Azure Data Factory.

  • Build robust data solutions using Microsoft data engineering tools such as Notebooks, Lakehouses, and Spark applications.

  • Build and automate deployment pipelines using CI/CD tools to release Fabric content from lower to higher environments.

  • Set up and use Git as a repository for versioning of Fabric components.

  • Create and manage Power BI reports and semantic models.

  • Write and optimize complex SQL queries to extract and analyze data, ensuring efficient data processing and accurate reporting.

  • Work closely with customers, business analysts, and the technology project team to understand business requirements; drive the analysis and design of quality technical solutions that align with business and technology strategies and comply with the organization's architectural standards.

  • Understand and follow change management procedures to implement project deliverables.

  • Coordinate with support groups to resolve issues with a quick turnaround.
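
For illustration only (not part of the job requirements), the SQL duty above might look like the following minimal sketch. It uses Python's built-in sqlite3 so it is self-contained; the table, index, and column names are invented, and in Fabric the equivalent query would run against a Lakehouse or Warehouse SQL endpoint instead:

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    -- Index on the filtered column so the WHERE clause can use it.
    CREATE INDEX idx_sales_region ON sales (region);
    INSERT INTO sales VALUES ('East', 100.0), ('East', 250.0), ('West', 75.0);
""")

# Filter and aggregate in SQL rather than in application code,
# so the database does the heavy lifting.
rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    WHERE region = 'East'
    GROUP BY region
""").fetchall()
print(rows)  # [('East', 350.0)]
```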


Mandatory

  • Bachelor's degree in computer science or a similar field, or equivalent work experience.

  • 3 years of experience working with Microsoft Fabric.

  • Expertise working with OneLake, Lakehouses, Warehouses, and Notebooks.

  • Strong understanding of Power BI reports and semantic models in Fabric.

  • Proven track record of building ETL and data solutions using Azure Data Factory.

  • Strong understanding of data warehousing concepts and ETL processes.

  • Hands-on experience building data warehouses in Fabric.

  • Strong skills in Python and PySpark.

  • Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.

  • Experience using Data Activator for effective data asset management and analytics.

  • Ability to flex and adapt to different tools and technologies.

  • Strong learning attitude.

  • Good written and verbal communication skills.

  • Demonstrated experience working in a team spread across multiple locations.


Preferable

  • Knowledge of AWS services.

  • Knowledge of Snowflake.

  • Knowledge of real-time analytics in Fabric.
