Azure Databricks Engineer

  • Posted 7 hours ago | Updated 7 hours ago

Overview

Remote
$80 - $90
Contract - W2

Skills

Continuous Delivery
Continuous Integration
Data Governance
Data Lake
Data Management
Change Management
Cloud Computing
Collaboration
Data Engineering
ARM
Access Control
Analytics
Apache Kafka
Artificial Intelligence
Real-time
Regulatory Compliance
Provisioning
PySpark
Microsoft Azure
Microsoft Power BI
Machine Learning (ML)
Management
Epic
Extract
Transform
Load
Orchestration
Python
Qualtrics
Git
GitHub
HL7
Meta-data Management
Network
RBAC
Data Modeling
Reporting
Requirements Elicitation
Research
SQL
Databricks
DevOps
Documentation
ELT
Extraction
RTL
Workflow
Streaming
Terraform
Unity
Version Control
Workday

Job Details

Job Description

Azure / Databricks Engineer and Data Operations Engineer

IV. Core Responsibilities

1. Requirements Gathering / Documentation / Story Creation
The Azure/Databricks Engineer and Data Operations Engineer will be responsible for designing, implementing, and managing enterprise data lake environments on Azure, using Databricks and other cloud technologies.

This role will be responsible for building ingestion solutions, collaborating on DataOps processes, developing data solutions, managing security access processes, and ensuring compliance with auditability and FinOps requirements. The ideal candidate will have a strong background in cloud technologies, particularly Azure Databricks, and a passion for driving automation and efficiency in data management processes.

You will also design, develop, and maintain data pipelines that facilitate the extraction, transformation, and loading (ETL) of data from various sources into our cloud data ecosystem. Your expertise in Azure services and data engineering best practices will be crucial in ensuring the reliability and efficiency of our data operations.
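As a rough illustration of the pipeline work described above, the sketch below shows a minimal PySpark batch ETL job that reads raw files from an ADLS Gen2 landing zone, applies light cleansing, and loads the result into a Delta table. The storage paths, container, and table names are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark ETL sketch (illustrative only). Paths and table names are
# assumed placeholders; on Databricks the `spark` session already exists.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

RAW_PATH = "abfss://landing@examplelake.dfs.core.windows.net/source/"  # assumed landing zone
TARGET_TABLE = "main.curated.source_data"                              # assumed Delta table

# Extract: read raw CSV files from the landing zone
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(RAW_PATH)
)

# Transform: normalize column names, drop duplicates, stamp the load time
clean_df = (
    raw_df
    .toDF(*[c.strip().lower().replace(" ", "_") for c in raw_df.columns])
    .dropDuplicates()
    .withColumn("load_ts", F.current_timestamp())
)

# Load: write to a managed Delta table for downstream analytics and reporting
clean_df.write.format("delta").mode("overwrite").saveAsTable(TARGET_TABLE)
```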

2. Stakeholder Alignment
Yes

3. Backlog and Change Management

  • Backlog: Track, follow, and define DevOps standards under which the backlog will be groomed and managed.

  • Change Management: No


V. Candidate Profile

Years of Experience: 2+ years preferred

Target Backgrounds / Industries:

  • Consulting

Top 3-5 Skills Required (that would stand out on a resume):

  • Azure Platform Core: Azure Databricks, Data Factory, Synapse, ADLS Gen2, and Key Vault for unified data engineering and analytics.

  • Infrastructure-as-Code (IaC): Terraform and Bicep for automated, consistent environment provisioning and configuration.

  • Programming & Orchestration: PySpark, SQL, and Python for pipeline development; Git-based version control for collaboration.

  • DevOps Automation: Azure DevOps or GitHub Actions for CI/CD pipelines and automated Databricks deployments.

  • Governance & Security: Unity Catalog, Collibra, Azure IAM/RBAC, and network isolation with Private Link and VNets.

  • Data Streaming & Delivery: Kafka or Event Hub for real-time ingestion; Power BI and Fabric for analytics consumption (a streaming ingestion sketch follows this list).

  • AI/ML Enablement: MLflow and Feature Store for model tracking, deployment, and reproducibility.
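To make the streaming ingestion item above concrete, the sketch below shows a minimal Structured Streaming job that reads JSON events from a Kafka topic (Azure Event Hubs also exposes a Kafka-compatible endpoint) and lands them in a Delta table. Broker, topic, schema, checkpoint location, and table names are assumed placeholders, and authentication options are omitted.

```python
# Illustrative Structured Streaming ingestion sketch: Kafka (or Event Hubs
# Kafka endpoint) -> Delta. All endpoint, topic, and table names are assumed.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.getOrCreate()

event_schema = StructType([
    StructField("record_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "example-broker:9093")  # assumed endpoint; SASL options omitted
    .option("subscribe", "example-events")                      # assumed topic
    .option("startingOffsets", "latest")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Continuously append parsed events to a Delta table, tracking progress in a checkpoint
(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "abfss://checkpoints@examplelake.dfs.core.windows.net/events/")
    .toTable("main.raw.example_events")  # assumed target table
)
```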

Preferred Skills / Nice-to-Haves:

  • Solid understanding of data modeling, ETL/ELT, and pipeline orchestration for both batch and streaming workloads.

  • Proficient with data governance concepts (lineage, metadata, access control, FinOps); an access-control sketch follows this list.

  • Able to automate infrastructure and workflows using modern DevOps practices.

  • Familiar with machine learning enablement, feature store patterns, and analytics data delivery.
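For the governance item above, the access-control piece is often expressed as Unity Catalog grants. The snippet below is a minimal sketch run from a notebook; the catalog, schema, and group names are hypothetical placeholders, and lineage/metadata tooling such as Collibra is not shown.

```python
# Illustrative Unity Catalog access-control sketch: grant an analyst group
# read access to a curated schema. All object and group names are assumed.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

CATALOG = "main"                    # assumed catalog
SCHEMA = "curated"                  # assumed schema
ANALYST_GROUP = "`data-analysts`"   # assumed account-level group

# Let the group discover the catalog and schema, then query its tables
spark.sql(f"GRANT USE CATALOG ON CATALOG {CATALOG} TO {ANALYST_GROUP}")
spark.sql(f"GRANT USE SCHEMA ON SCHEMA {CATALOG}.{SCHEMA} TO {ANALYST_GROUP}")
spark.sql(f"GRANT SELECT ON SCHEMA {CATALOG}.{SCHEMA} TO {ANALYST_GROUP}")
```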

Certifications / Degrees Required:

  • Bachelor's Degree Required

Systems or Tools Used:

  • Epic, Workday, Strata, Qualtrics, Imaging systems, Research applications, RTLS, DAXs, HL7, FHIR

  • Terraform, Azure Resource Manager (ARM) templates, or Bicep

Reporting / Documentation Systems:

  • Develop and optimize ETL processes to ensure high-quality data is available for analysis and reporting.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.