Databricks Developer / Tech Lead

Overview

On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Databricks
SQL
Azure
ETL
Delta Lake
Governance
PySpark
Scala
Azure Data Factory

Job Details

Job Title: Databricks Developer / Tech Lead

Location: Tampa, FL (On-Site)

Job Description:

About the Role:

We are looking for a skilled Data Engineer with strong experience in Databricks SQL and Azure Delta Lake to join our data team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and optimizing data workflows in a cloud environment. Experience with ETL tools like Informatica is a plus.

Key Responsibilities:

Develop, test, and maintain data pipelines and workflows using Databricks and Azure Delta Lake.

Design and implement data models, tables, and views to support analytics and business intelligence use cases.

Optimize data ingestion and transformation processes for performance and cost-effectiveness.

Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality solutions.

Ensure data quality, governance, and security best practices are followed.

Troubleshoot data issues and provide timely support to data consumers.

Assist in migrating and integrating data from on-premises and other cloud sources into Azure Delta Lake.

(Good to have) Develop and maintain ETL processes using Informatica to support data integration requirements.

Required Skills & Qualifications:

Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

Proven experience working with Databricks Notebooks.

Strong knowledge of Spark (PySpark/Scala) within Databricks environments.

Understanding of Azure Data Factory or other orchestration frameworks.

Strong knowledge of Azure Delta Lake architecture, notebook usage, and best practices.

Hands-on experience with cloud data platforms, preferably Azure.

Proficient in writing complex SQL queries and data transformation logic.

Familiarity with data warehousing concepts and cloud-based lakehouse architecture.

Strong problem-solving and communication skills.

Experience with version control and collaboration tools like Git.

Good to Have:

Experience with ETL tools, especially Informatica, for data integration and workflow orchestration.

Experience with CI/CD pipelines for data projects.

Familiarity with data governance, metadata management, and data catalog tools.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.