Tech Lead with Databricks

Overview

Remote
$120,000 - $140,000
Full Time

Skills

Databricks
AI/ML
Data Warehouse
Azure
AKS
PySpark
SQL

Job Details

 

Introduction

Join an amazing company where you can work with cutting-edge technologies and platforms. Give your career an Infinite edge with a stimulating environment and a global work culture. Be part of an organization that celebrates integrity, innovation, collaboration, teamwork, and passion, and where every employee is a leader delivering ideas that make a difference in the world we live in.

 

The Databricks Tech Lead's responsibilities include, but are not limited to:

- Lead the end-to-end architecture, design, and implementation of scalable data, analytics, and AI solutions using the Databricks Lakehouse Platform.

- Act as the primary Databricks and AI subject-matter expert, defining technical standards across data engineering, analytics, and AI/ML workloads.

- Architect and govern Analytics and Data Warehouse solutions using Databricks Lakehouse (Delta Lake).

- Architect and implement an enterprise-grade AI/ML model store leveraging Databricks Model Registry, MLflow, Unity Catalog, and open model formats.

- Define and operationalize a portable model lifecycle, ensuring AI/ML models can seamlessly move across environments (Databricks, Azure AKS, serverless endpoints, and edge systems).

- Establish enterprise AI/ML governance frameworks that integrate policy enforcement, lineage, auditability, model approval workflows, and risk controls directly within Databricks.

- Design and lead implementation of data and AI governance controls using Unity Catalog, including data classification, permissions models, lineage, entitlements, tags, and encryption policies.

- Champion adoption of open standards for AI workloads (Delta Lake, Parquet, MLflow, ONNX, Apache Arrow, PyFunc, HuggingFace ecosystem) to ensure interoperability and vendor-neutral portability.

- Define standards for feature store governance, ensuring consistency, reusability, and cross-domain feature sharing while enforcing data quality and access policies.

- Architect model operationalization pipelines including training, retraining, evaluation, drift detection, promotion, and rollback across multiple environments.

- Implement cross-cloud and hybrid model portability, ensuring models can be deployed on Databricks Model Serving, Azure AKS, API Gateways, or customer environments with minimal friction.

- Integrate AI solutions with enterprise metadata systems and catalogs to ensure end-to-end lineage from raw data → features → models → predictions.

- Design and implement ETL/ELT pipelines using PySpark, Spark SQL, Databricks Workflows, and Delta Live Tables.

- Lead AI and agentic AI solutions using agentic frameworks, AI agents, and Databricks AI Agent Builder.

- Define and implement MLOps and LLMOps architectures using MLflow.

- Design and operate model serving architectures using Azure serverless AKS.

- Implement AI observability, monitoring model performance, drift, latency, and cost.

- Integrate Databricks with enterprise APIs and microservices, including Apigee-based API Gateways.

- Define and support Databricks Disaster Recovery (DR) strategies.

- Establish standards for CI/CD, DevOps, DataOps, and MLOps.

- Mentor engineers and communicate architectural decisions to leadership.

 

In addition to the qualifications listed below, the ideal candidate will demonstrate the following traits:

- Strong technical leadership and architectural thinking.

- Ownership-driven mindset for production-grade systems.

- Ability to translate advanced AI concepts into enterprise solutions.

- Clear communication with technical and executive audiences.

 


Minimum Qualifications:

- Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.

- 10+ years of overall professional experience.

- 3+ years of experience as a Tech Lead or equivalent role.

- 5+ years of hands-on experience with Databricks.

- Strong experience with PySpark, Spark SQL, and Python or Scala.

- Experience with AI Agentic frameworks, LLMOps, MLflow, and AI Gateway.

- Experience with model serving on Azure serverless AKS.

- Experience integrating APIs and microservices with Databricks.

- Knowledge of AI observability and Databricks DR.

- Strong experience with CI/CD and cloud environments (Azure preferred).

- Strong English communication skills.

 

Preferred Qualifications:

- Experience with Databricks Model Serving and Feature Store.

- Experience with Kafka and event-driven architectures.

- Experience integrating Databricks with Snowflake.

- Knowledge of Responsible AI frameworks.

- Databricks or Azure certifications.

 

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.