Fabric Data Engineer (Remote)

Remote in Dallas, TX, US • Posted 14 hours ago • Updated 2 hours ago
Contract W2
USD60 - USD72/hr

Job Details

Skills

  • Artificial Intelligence
  • RESTful
  • Apache Parquet
  • Auditing
  • Meta-data Management
  • Data Deduplication
  • Normalization
  • Dimensional Modeling
  • Analytics
  • Continuous Integration
  • Continuous Delivery
  • Git
  • Documentation
  • Caching
  • Apache Spark
  • Computer Science
  • Data Engineering
  • Microsoft
  • Microsoft Power BI
  • Data Analysis
  • Python
  • PySpark
  • SQL
  • Alteryx
  • Microsoft SharePoint
  • Microsoft Outlook
  • Microsoft Exchange
  • Data Modeling
  • Star Schema
  • Data Lake
  • Data Quality
  • Cloud Computing
  • Problem Solving
  • Conflict Resolution
  • Collaboration
  • Communication
  • Databricks
  • Microsoft Azure
  • API
  • Authentication
  • OAuth
  • Data Governance
  • Data Masking
  • Streaming
  • Extract
  • Transform
  • Load
  • Health Care
  • Finance
  • Data Domain

Summary

Fabric Data Engineer (Remote)

We are seeking a skilled Data Engineer with strong Python/PySpark expertise to build and maintain scalable cloud-based pipelines within Azure environments such as Microsoft Fabric or Synapse.

As part of our process, after applying you may receive an invitation from our AI recruiter, Avery, for a short conversation that lets you share more about your background beyond your resume. For questions, contact .


  • Job Type: Long-term contract
  • Compensation: This role is expected to pay approximately $60 - $72 per hour, plus benefits
  • Location: Remote
  • No C2C. No visa sponsorship is available for this role

What You'll Do:


  • Design, develop, test, and maintain PySpark notebooks to ingest data from SharePoint (REST/Graph API), Outlook (Graph/REST API), and third-party REST APIs
  • Bronze layer (raw): persist ingestion in Delta Lake/Parquet with minimal transformations, preserving schema drift and auditing metadata
  • Silver layer (curated): apply data quality checks, deduplication, normalization, standardization, and business-key alignment
  • Implement and maintain a star-schema data model (facts and dimensions) for analytics
  • Build and maintain CI/CD for data pipelines (Git, unit tests, deployment to Databricks/Azure Synapse, etc.)
  • Implement monitoring, alerting, retries, and idempotent ingest strategies
  • Data governance: lineage, masking of PII and other sensitive fields, role-based access
  • Documentation: pipeline design docs, data dictionaries, and runbooks
  • Collaborate with Data Analysts, Data Scientists, and Business Stakeholders to translate requirements into scalable pipelines
  • Optimize performance (partitioning, caching, Delta tables, spark configurations) and control costs
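The Bronze/Silver responsibilities above follow a repeatable pattern: land raw records untouched (plus audit metadata), then clean, normalize, and deduplicate on a business key. A minimal pure-Python sketch of that logic, with the caveat that a real pipeline would do this in PySpark against Delta tables; the function names and the `_ingested_at`/`_source` fields are illustrative assumptions, not from this posting:

```python
from datetime import datetime, timezone

def to_bronze(raw_records, source):
    """Bronze layer: persist records with minimal transformation, attaching
    audit metadata (source system, ingest timestamp) and tolerating schema
    drift by keeping every field the source sent."""
    ingested_at = datetime.now(timezone.utc).isoformat()
    return [
        {**rec, "_source": source, "_ingested_at": ingested_at}
        for rec in raw_records
    ]

def to_silver(bronze_records, business_key):
    """Silver layer: drop records missing the business key (a simple data
    quality check), normalize string fields, and deduplicate on the business
    key, keeping the most recently ingested copy."""
    latest = {}
    for rec in bronze_records:
        key = rec.get(business_key)
        if key is None:  # reject rows that fail the business-key check
            continue
        clean = {
            k: v.strip().lower() if isinstance(v, str) and not k.startswith("_") else v
            for k, v in rec.items()
        }
        prev = latest.get(key)
        if prev is None or clean["_ingested_at"] >= prev["_ingested_at"]:
            latest[key] = clean
    return list(latest.values())
```

The same split also makes idempotent re-ingestion straightforward: replaying a Bronze batch through `to_silver` converges to the same deduplicated result.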

What Gets You the Job:


  • Bachelor's or Master's in Computer Science, Data Engineering, or a related field
  • 4+ years of experience as a Data Engineer
  • Microsoft Fabric experience is highly preferred; experience with Synapse, Power BI, or data analytics is acceptable
  • Must be proficient in Python/PySpark notebooks
  • Strong data ingestion skills are required
  • Must have strong SQL skills
  • Experience with Alteryx / MapLogic
  • Experience ingesting data from REST APIs (SharePoint/Graph API, Outlook/Exchange, 3rd party APIs)
  • Demonstrated experience with Bronze/Silver/Gold data modeling, and star schema design
  • Experience with Delta Lake / data lake architectures
  • Familiarity with data quality frameworks and profiling
  • Experience with cloud data platforms (Azure preferred: Databricks, ADLS Gen2, Synapse)
  • Strong problem-solving, collaboration, and communication skills
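The star-schema modeling called for above pairs a narrow fact table with descriptive dimension tables joined on keys, so analytics queries aggregate the fact and slice by dimension attributes. A toy sqlite3 sketch of the shape (all table and column names are hypothetical, not from this posting):

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, region TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    amount REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024);
INSERT INTO dim_customer VALUES (1, 'Acme', 'West'), (2, 'Globex', 'East');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240101, 2, 250.0), (20240101, 1, 50.0);
""")

# Typical analytics query: aggregate the fact, sliced by dimension attributes.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d ON d.date_key = f.date_key
    WHERE d.year = 2024
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 250.0), ('West', 150.0)]
```

In a Fabric/Synapse Gold layer the same design would live in Delta tables, but the fact/dimension join pattern is identical.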

Nice-to-Have


  • Databricks certification or Azure Data Engineer Associate
  • Experience with Graph API authentication (OAuth2), service principals
  • Data governance tools and data masking concepts
  • Experience with streaming/batch hybrid ETL patterns
  • Knowledge of healthcare/finance data domain or other regulated data
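Ingesting from SharePoint or Outlook via Microsoft Graph typically means OAuth2 client-credentials authentication plus following Graph's `@odata.nextLink` pagination, where each response page may carry the URL of the next page. A small sketch of that pagination walk; the authenticated HTTP call is injected as a callable, so no network, credentials, or specific HTTP library are assumed:

```python
def fetch_all_pages(fetch_page, first_url):
    """Follow Microsoft Graph-style pagination: each response dict may carry
    an '@odata.nextLink' URL; keep requesting until it is absent.
    `fetch_page` is the injected authenticated GET (url -> parsed JSON dict),
    so the traversal logic is testable without a live tenant."""
    items, url = [], first_url
    while url:
        page = fetch_page(url)
        items.extend(page.get("value", []))       # Graph puts results under "value"
        url = page.get("@odata.nextLink")          # absent on the final page
    return items
```

In production the injected callable would attach the OAuth2 bearer token obtained via a service principal, and wrap the GET with the retry/backoff policy the role description asks for.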

Irvine Technology Corporation (ITC) connects top talent with exceptional opportunities in IT, Security, Engineering, and Design. From startups to Fortune 500s, we partner with leading companies nationwide. Our AI recruiter, Avery, helps streamline the first step of your journey so we can focus on what matters most: helping you grow. Join us. Let us ELEVATE your career!

Irvine Technology Corporation provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Irvine Technology Corporation complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: itcca
  • Position Id: 23236