Azure Data Engineer with Databricks & Medallion Architecture

Remote • Posted 13 hours ago • Updated 12 hours ago
Contract W2
Contract Independent
No Travel Required
Remote
$70 - $75/hr

Job Details

Skills

  • Data Governance
  • Data Engineering
  • Databricks
  • ELT
  • Machine Learning Operations (ML Ops)
  • Python
  • SQL
  • Medallion Architecture

Summary

Data Engineer

Remote

Role Summary

As a Data Engineer on the team, you will design, build, and operate high-quality data pipelines and platforms that power advanced analytics, optimization models, and enterprise dashboards. You will work at the intersection of data engineering, analytics engineering, and platform enablement—supporting everything from proof-of-concept workflows to production-grade, high-SLA pipelines running on Databricks.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines for ingestion, transformation, and serving of analytics-ready datasets across multiple regions and business domains.
  • Partner with data scientists, optimization engineers, and business stakeholders to translate analytical requirements into clean, reusable data models and curated datasets.
  • Productionize analytics and model workflows by implementing robust, monitored, and well-governed pipelines that meet enterprise standards for reliability and performance.
  • Support end-to-end lifecycle execution, from MVP through production and run-and-maintain, including refactoring legacy pipelines and enabling new use cases.
  • Implement best practices in data quality, testing, versioning, and CI/CD for data pipelines and analytics workflows.
  • Enable platform capabilities on cloud-native analytics stacks (e.g., Databricks, Unity Catalog), including environment setup, access control, and governance.
  • Troubleshoot data issues, perform root-cause analysis, and continuously improve pipeline performance and maintainability.
  • Collaborate cross-functionally with IT, platform teams, and global business partners to integrate solutions with enterprise systems (e.g., planning, supply chain, commercial tools).
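The ingestion-to-serving flow described in these responsibilities follows the medallion pattern (raw "bronze" data, cleaned "silver" data, analytics-ready "gold" aggregates). A minimal sketch in plain Python, with hypothetical table names and fields; in practice each layer would live in Databricks Delta tables rather than in-memory lists:

```python
from collections import defaultdict

# Bronze: raw records as ingested, untyped and possibly dirty.
bronze = [
    {"region": "NA", "order_id": "1001", "amount": "250.00"},
    {"region": "NA", "order_id": "1001", "amount": "250.00"},  # duplicate
    {"region": "EU", "order_id": "1002", "amount": "99.50"},
    {"region": "EU", "order_id": "1003", "amount": None},      # bad record
]

def to_silver(rows):
    """Clean and conform: drop bad records, cast types, deduplicate on order_id."""
    seen, silver = set(), []
    for row in rows:
        if row["amount"] is None or row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        silver.append({"region": row["region"],
                       "order_id": int(row["order_id"]),
                       "amount": float(row["amount"])})
    return silver

def to_gold(rows):
    """Serve an analytics-ready aggregate: total order amount per region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'NA': 250.0, 'EU': 99.5}
```

The same bronze → silver → gold shape scales up to Spark DataFrames and Delta tables; only the storage and execution engine change, not the layering.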

Required Qualifications

  • Bachelor’s degree in Computer Science, Engineering, Data Science, or a related quantitative field, with 3+ years of experience in data engineering.
  • Strong experience building and maintaining data pipelines using Python and SQL in a cloud environment.
  • Hands-on experience with modern data platforms, particularly Databricks.
  • Solid understanding of data modeling, ETL/ELT patterns, and analytics engineering concepts.
  • Experience supporting production workloads with a focus on reliability, observability, and data quality.
  • Ability to work effectively with both technical and non-technical stakeholders in a fast-paced, cross-functional environment.

Preferred Qualifications

  • Knowledge of modern data architectures, including medallion architecture and lakehouse patterns.
  • Experience with MLOps or analytics CI/CD practices.
  • Experience migrating or refactoring pipelines from legacy platforms to cloud-native architectures.
  • Exposure to enterprise data governance, security, and compliance requirements.
  • Background in agriculture, manufacturing, or large-scale operational analytics is a plus.
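The data-quality and CI/CD practices mentioned above can start as assertion-style checks that run as a gate before a dataset is published downstream. A minimal sketch; the rule set and column names here are hypothetical:

```python
def run_quality_checks(rows, required_columns, non_null_columns):
    """Return a list of human-readable failures; an empty list means the batch passes.

    Intended to run as a CI/CD gate before a dataset is promoted to production.
    """
    failures = []
    for i, row in enumerate(rows):
        missing = [c for c in required_columns if c not in row]
        if missing:
            failures.append(f"row {i}: missing columns {missing}")
        nulls = [c for c in non_null_columns if row.get(c) is None]
        if nulls:
            failures.append(f"row {i}: null values in {nulls}")
    return failures

batch = [
    {"order_id": 1, "region": "NA", "amount": 250.0},
    {"order_id": 2, "region": None, "amount": 99.5},
]
print(run_quality_checks(batch, ["order_id", "region", "amount"], ["region"]))
# ["row 1: null values in ['region']"]
```

On Databricks, the same idea is typically expressed declaratively (for example as pipeline expectations) rather than hand-rolled, but the pass/fail contract is the same.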
 
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91134724
  • Position Id: 8941421
