Databricks Lead

Detroit, MI, US • Posted 18 hours ago • Updated 18 hours ago
Contract W2
No Travel Required
Able to Sponsor
On-site
60+

Job Details

Skills

  • PySpark
  • Spark SQL
  • framework development
  • Databricks Workflows
  • Oracle‑to‑Databricks
  • Delta Lake

Summary

Position Title: Databricks Lead (Data Migration Lead)

Location: Detroit, MI (Onsite)

Duration: 24+ Months

 

Job Overview

We are looking for a senior, deeply hands-on Databricks Lead to drive a large-scale Oracle-to-Databricks migration covering schema migration, code conversion, and ODI job modernization. The ideal candidate has extensive experience building enterprise-grade data platforms on Databricks, has executed at least one greenfield Databricks implementation, and is exceptionally strong in PySpark, Spark SQL, framework development, and Databricks Workflows.

 

Key Responsibilities

  • Architect, design, and implement cloud-native data platforms using Databricks (ingestion → transformation → consumption).
  • Lead the full Oracle → Databricks migration including schema translation, ETL/ELT logic modernization, and ODI job replacement.
  • Develop reusable PySpark frameworks, data processing patterns, and orchestration using Databricks Workflows.
  • Build scalable, secure, and cost‑optimized Databricks infrastructure and data pipelines.
  • Collaborate with business and technical stakeholders to drive data modernization strategy.
  • Establish development best practices, coding standards, CI/CD, and DevOps/DataOps patterns.
  • Provide technical mentorship and create training plans for engineering teams.
  • Contribute to building MLOps and advanced operations frameworks.
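To illustrate the "reusable PySpark frameworks" responsibility above, here is a minimal sketch of a registry of named, composable transformation steps. It is written in plain Python (operating on lists rather than Spark DataFrames) so it stays self-contained; all names here are illustrative assumptions, not the client's actual framework. In a real Databricks job, each registered step would take and return a DataFrame, and Databricks Workflows would orchestrate the pipeline runs.

```python
from typing import Callable, Dict, List

# Hypothetical step registry: each transformation is registered once under a
# stable name, and pipelines are declared as ordered lists of step names.
STEPS: Dict[str, Callable[[list], list]] = {}

def step(name: str):
    """Decorator that registers a transformation under a stable name."""
    def register(fn):
        STEPS[name] = fn
        return fn
    return register

@step("drop_nulls")
def drop_nulls(rows: list) -> list:
    # In PySpark this would be df.dropna() or a filter on key columns.
    return [r for r in rows if r is not None]

@step("uppercase")
def uppercase(rows: list) -> list:
    # Stand-in for a column-level transformation (e.g. F.upper on a column).
    return [r.upper() for r in rows]

def run_pipeline(rows: list, plan: List[str]) -> list:
    """Apply the named steps in order — the orchestration piece that a
    Databricks Workflows task graph would drive in production."""
    for name in plan:
        rows = STEPS[name](rows)
    return rows
```

For example, `run_pipeline(["a", None, "b"], ["drop_nulls", "uppercase"])` returns `["A", "B"]`. The point of the pattern is that pipeline definitions become declarative data (lists of step names), so new pipelines reuse tested steps instead of duplicating transformation code.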

 

Required Qualifications

  • 14+ years in Data Engineering/Architecture, including at least 4 years of hands-on Databricks experience delivering end-to-end cloud data solutions.
  • Strong experience migrating from Oracle/on‑prem systems to Databricks, including SQL, PL/SQL, ETL logic, and ODI pipelines.
  • Deep hands-on expertise in:
    • PySpark, Spark SQL, Delta Lake, Unity Catalog
    • Building reusable data frameworks
    • Designing high‑performance batch and streaming pipelines
  • Proven experience with greenfield Databricks implementations.
  • Strong understanding of cloud-native architectures on AWS and modern data platform concepts.
  • Solid knowledge of data warehousing, columnar databases, and performance optimization.
  • Good understanding of Agile/Scrum development processes.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91156932
  • Position Id: 25686
  • Posted 18 hours ago

Company Info

About Quantom Tech LLC

Quantom Tech, a dynamic force in IT consultancy, is reshaping the industry with innovative training and placements. Fast-growing and forward-thinking, we redefine success in the ever-evolving tech landscape.

At Quantom Tech, we believe in empowering the workforce of the future. Our training programs are meticulously designed to equip professionals with the skills needed to navigate the changing IT landscape.

We're committed to cultivating top talent in the IT industry. Our rigorous training ensures our candidates possess cutting-edge skills, empowering them to thrive in today's tech scene.

Whether you are a seasoned professional or a fresh graduate, we offer a range of opportunities in different roles and domains.
