Senior ETL Databricks Engineer - Remote - W2 / Own corporation

Remote • Posted 5 hours ago • Updated 5 hours ago
Contract W2
Contract Independent
Able to Sponsor
Remote
Depends on Experience


Job Details

Skills

  • ETL
  • Databricks
  • PySpark

Summary

Senior ETL Databricks Engineer

Atlanta, GA (locals preferred) or Remote

Long-term

Job Description:

Overview

We are seeking a skilled Databricks Engineer to design, build, and optimize large-scale data pipelines and analytics solutions using Azure Databricks, Spark, and modern cloud data platforms. The ideal candidate has strong experience in distributed data processing, ETL/ELT development, and cloud-native engineering practices.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Databricks (PySpark/SQL) for batch and streaming workloads.
  • Build scalable data processing solutions leveraging Spark (RDD, DataFrames, Datasets).
  • Develop and optimize Delta Lake tables, medallion architecture, and data quality frameworks.
  • Implement CI/CD pipelines for Databricks notebooks, jobs, and workflows.
  • Integrate Databricks with Azure Data Factory, ADLS Gen2, Synapse, Event Hubs, and other cloud services.
  • Configure and manage Databricks clusters, Unity Catalog, permissions, and workspace governance.
  • Collaborate with data architects, analysts, and business stakeholders to translate requirements into technical solutions.
  • Ensure data reliability, performance tuning, and cost optimization across the platform.
  • Implement security best practices including RBAC, Key Vault integration, and data encryption.
  • Troubleshoot production issues and support operational workloads.
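The medallion-architecture work described above (raw ingestion, cleansing with data-quality rules, business-level aggregation) can be sketched in heavily simplified pure-Python form. A real Databricks pipeline would use PySpark DataFrames and Delta Lake tables; the plain dicts, field names, and layer functions below are illustrative only:

```python
# Toy illustration of medallion layering: bronze -> silver -> gold.
# All record fields and function names here are hypothetical.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [dict(row, _source="landing_zone") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: apply data-quality rules and deduplicate on the business key."""
    seen, out = set(), []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # data-quality rule: reject incomplete records
        if row["order_id"] in seen:
            continue  # deduplicate on order_id
        seen.add(row["order_id"])
        out.append(row)
    return out

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate (revenue per customer)."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
    return totals

raw = [
    {"order_id": 1, "customer": "acme", "amount": 100},
    {"order_id": 1, "customer": "acme", "amount": 100},  # duplicate
    {"order_id": 2, "customer": "globex", "amount": 50},
    {"order_id": None, "customer": "bad", "amount": 5},  # malformed
]
revenue = gold_aggregate(silver_clean(bronze_ingest(raw)))
```

On Databricks, each stage would typically be a Delta table written incrementally (batch or Structured Streaming) rather than an in-memory list.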

Required Skills

  • Strong hands-on experience with Azure Databricks, PySpark, and Spark SQL.
  • Proficiency in Python, SQL, and distributed data processing.
  • Experience with Azure Data Factory, ADLS, Synapse, or similar cloud data services.
  • Knowledge of Delta Lake, ACID transactions, schema evolution, and time travel.
  • Understanding of data modeling, including dimensional modeling and SCD patterns.
  • Experience with Git, DevOps pipelines, and automated deployments.
  • Familiarity with streaming technologies (Structured Streaming, Event Hubs, Kafka).
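As an illustration of the SCD patterns listed above, here is a minimal Type 2 update in plain Python: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history. A production implementation would usually be a Delta Lake MERGE in Databricks; the schema and helper name below are hypothetical:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, as_of):
    """Toy SCD Type 2: close the current row on change, append a new current row."""
    for row in dim_rows:
        if row[key] == new_attrs[key] and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # attributes unchanged: no-op
            row["is_current"] = False  # expire the old version
            row["end_date"] = as_of
    dim_rows.append(dict(new_attrs, start_date=as_of, end_date=None, is_current=True))
    return dim_rows

dim = [{"customer_id": 7, "city": "Atlanta",
        "start_date": date(2023, 1, 1), "end_date": None, "is_current": True}]
# Customer 7 moves: the Atlanta row is expired, a Suwanee row becomes current.
dim = scd2_upsert(dim, "customer_id",
                  {"customer_id": 7, "city": "Suwanee"}, date(2024, 6, 1))
```

The same close-and-insert logic maps directly onto a Delta `MERGE ... WHEN MATCHED ... WHEN NOT MATCHED` statement against the dimension table.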

Preferred Qualifications

  • Experience with Unity Catalog, Delta Sharing, or Databricks governance frameworks.
  • Background in Snowflake, AWS, or multi-cloud environments.
  • Exposure to MLflow, feature engineering, or machine learning pipelines.
  • Industry experience in healthcare, finance, utilities, or enterprise data platforms.

Certifications (Preferred)

  • Databricks Certified Data Engineer Associate
  • Databricks Certified Data Engineer Professional
  • Databricks Lakehouse Fundamentals
  • Microsoft Azure Data Engineer Associate (DP-203)
  • Microsoft Azure Solutions Architect Expert (AZ-305) (plus)
  • Microsoft Azure Administrator (AZ-104) (plus)
  • SnowPro Core (optional, for hybrid lakehouse/warehouse environments)
  • AWS Data Analytics Specialty

Regards,


Vinay Ram


Suwanee, GA - 30024

An MBE & eVerify Company


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91081414
  • Position Id: 8948119
