Data Engineering Tech Lead (Databricks Migration)

Remote • Posted 2 hours ago • Updated 2 hours ago
Contract: Corp-to-Corp, W2, or Independent
No Travel Required
Compensation: Depends on Experience

Job Details

Skills

  • Data Engineering Tech Lead
  • Data Engineer
  • Databricks
  • AWS Glue
  • Redshift

Summary

Job Title: Data Engineering Tech Lead (Databricks Migration)

Location: Remote
Employment Type: Contract / Full-Time


Job Description:

We are seeking an experienced Data Engineering Tech Lead to drive a large-scale migration from AWS Glue and Amazon Redshift to a Databricks Lakehouse architecture. This role owns the end-to-end migration lifecycle, defines technical standards, and leads the transformation of legacy data platforms into scalable, modern data solutions.


Key Responsibilities:

🔹 Migration Leadership

  • Lead migration from AWS Glue/Redshift to Databricks
  • Own full lifecycle: discovery, design, build, testing, cutover, and decommissioning
  • Inventory and prioritize pipelines, stored procedures, and workloads
  • Define wave-based migration strategy with success criteria
  • Ensure data parity, reconciliation, and regression testing
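To illustrate the kind of parity and reconciliation checks this role would own, here is a minimal sketch in plain Python. The table names, stats, and the idea of comparing row counts plus per-column aggregate checksums are illustrative assumptions, not a description of any existing tooling on this project:

```python
from dataclasses import dataclass

@dataclass
class TableStats:
    """Summary statistics pulled from one side of the migration,
    e.g. a legacy Redshift table or its migrated Delta table."""
    row_count: int
    column_checksums: dict  # column name -> aggregate checksum/sum

def reconcile(legacy: TableStats, migrated: TableStats) -> list:
    """Return a list of human-readable parity failures (empty list = parity)."""
    failures = []
    if legacy.row_count != migrated.row_count:
        failures.append(
            f"row count mismatch: {legacy.row_count} vs {migrated.row_count}"
        )
    for col, checksum in legacy.column_checksums.items():
        if migrated.column_checksums.get(col) != checksum:
            failures.append(f"checksum mismatch on column '{col}'")
    return failures
```

In practice the stats would be computed with aggregate queries on both platforms (e.g. `COUNT(*)` and `SUM`/hash aggregates per column) and the check wired into regression testing for each migration wave.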

🔹 Architecture & Engineering

  • Design Databricks Lakehouse architecture (Unity Catalog, medallion layers, S3 storage)
  • Define cluster strategies, SQL Warehouses, and cost optimization models
  • Re-platform pipelines using PySpark, Spark SQL, Delta Live Tables, Databricks Workflows
  • Support downstream integrations (BI, ML, applications)
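As background on the Unity Catalog plus medallion-layer design named above: Unity Catalog addresses tables through a three-level `catalog.schema.table` namespace, and one common convention maps each medallion layer to its own schema. A small hypothetical helper (catalog and table names are placeholders, and the layer-per-schema mapping is one possible convention, not a requirement):

```python
# Medallion layers as commonly defined: raw -> cleansed -> business-ready.
MEDALLION_LAYERS = ("bronze", "silver", "gold")

def qualified_name(catalog: str, layer: str, table: str) -> str:
    """Build a Unity Catalog three-level name, mapping each
    medallion layer to its own schema (illustrative convention)."""
    if layer not in MEDALLION_LAYERS:
        raise ValueError(f"unknown medallion layer: {layer}")
    return f"{catalog}.{layer}.{table}"
```

For example, `qualified_name("analytics", "silver", "orders")` yields `analytics.silver.orders`, which a re-platformed PySpark pipeline could read and write via `spark.table(...)` / `saveAsTable(...)`.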

🔹 Standards & Governance

  • Establish CI/CD pipelines (GitHub Actions, Azure DevOps, Terraform)
  • Define coding standards, testing frameworks, and deployment practices
  • Implement data quality, observability, and lineage frameworks
  • Manage security and governance via Unity Catalog (RBAC, audit, compliance)

🔹 Optimization & Performance

  • Optimize Spark performance (partitioning, shuffles, caching, file sizing)
  • Manage cluster sizing, autoscaling, and cost optimization strategies
  • Ensure efficient use of compute (Photon, job clusters vs all-purpose clusters)
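A back-of-the-envelope sketch of the partition-sizing reasoning behind the bullets above: shuffle partitions are often sized so each task processes roughly one target-sized chunk (128 MB is a commonly cited starting point; the right value is workload-dependent). The helper below is illustrative arithmetic, not project tooling:

```python
# A common rule of thumb targets ~128 MB per shuffle partition;
# tune this for the actual workload and cluster.
TARGET_PARTITION_BYTES = 128 * 1024 * 1024

def suggested_shuffle_partitions(shuffle_bytes: int, min_partitions: int = 1) -> int:
    """Estimate a spark.sql.shuffle.partitions value for a stage
    that shuffles roughly `shuffle_bytes` of data."""
    partitions = -(-shuffle_bytes // TARGET_PARTITION_BYTES)  # ceiling division
    return max(min_partitions, partitions)
```

For a 10 GiB shuffle this suggests 80 partitions; on Databricks, adaptive query execution can also coalesce shuffle partitions automatically, so a fixed setting is only a starting point.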

Required Skills:

  • 8+ years of data engineering experience
  • 2+ years in a Tech Lead / Lead Engineer role
  • Strong hands-on experience with:
    • Databricks (Delta Lake, Unity Catalog, DLT, Workflows)
    • Apache Spark (PySpark, Spark SQL)
    • AWS Glue & Amazon Redshift
  • Expertise in SQL, Python, and CI/CD pipelines
  • Strong experience with AWS services (S3, IAM, KMS, VPC, Lambda, Kinesis, CloudWatch)
  • Experience leading end-to-end data platform migrations
  • Strong understanding of data modeling (dimensional, Data Vault, medallion)

Preferred Skills:

  • Experience with Terraform / Databricks Asset Bundles
  • Exposure to data observability tools (Great Expectations, etc.)
  • Strong documentation and stakeholder communication skills


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10204540
  • Position Id: 85183-2308-