Fabric Developer

Phoenix, AZ, US • Posted 4 days ago • Updated 4 days ago
Contract (W2 or Corp-to-Corp)
On-site
$60 - $70/hr

Job Details

Skills

  • Microsoft Fabric
  • Data Factory
  • Lakehouse
  • OneLake
  • PySpark
  • Spark
  • SQL
  • ELT pipelines
  • Delta Tables

Summary

Position Title: Fabric Developer / Sr. Data Engineer

Location: Phoenix, AZ (on-site, 5 days/week). Arizona-based candidates only; no relocation.

Years of Experience: 8+ years

Responsibilities:

  • Builds and maintains ELT/ETL pipelines using Microsoft Fabric tools, enabling efficient data ingestion from multiple sources.
  • Applies transformations, cleanses, and enriches data to ensure it is ready for analysis and reporting.
  • Handles large datasets, optimizing storage and retrieval for performance.
  • Implements automation for data processing and integration workflows, reducing manual intervention.
  • Works with Platform Architects to ensure infrastructure supports data requirements.
  • Partners with report developers to ensure that data is in a usable format and ready for analysis.
  • Ensures code reusability and parameterization.
  • Creates interactive, intuitive reports and dashboards using Microsoft Fabric's reporting tools.

Qualifications:

  • Data Factory (in Fabric): Designing and orchestrating data ingestion and transformation pipelines (ETL/ELT).
  • Data Engineering Experience (Spark): Using Notebooks (PySpark, Spark SQL, Scala) and Spark Job Definitions for complex data processing, cleansing, enrichment, and large-scale transformations directly on OneLake data.
  • Lakehouse Items: Creating and managing Lakehouse structures (Delta tables, files) as the primary landing and processing zone within OneLake.
  • OneLake / ADLS Gen2: Understanding storage structures, Delta Lake format, partitioning strategies, and potentially managing Shortcuts.
  • Monitoring Hub: Tracking pipeline runs and Spark job performance.
  • Core Responsibilities (Fabric Context): Building ingestion pipelines from diverse sources; implementing data cleansing and quality rules; transforming raw data into curated Delta tables within Lakehouses or Warehouses; optimizing Spark jobs and data layouts for performance and cost; managing pipeline schedules and dependencies; ensuring data security and governance principles are applied to pipelines and data structures.
  • Excellent communication and collaboration skills
  • Bachelor's degree in Computer Science, Engineering, or a relevant field
  • Azure/AWS cloud certifications are a plus
  • Experience in the Manufacturing domain is a plus
  • Self-driven and able to drive projects through to delivery
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10119106
  • Position Id: 8925372
