Azure Databricks Architect - New York, NY

Overview

On Site
Depends on Experience
Contract - W2
Contract - 12 Month(s)

Skills

Azure
Databricks
Architect
Python
SQL
PySpark
Google Cloud Platform
DevOps
Amazon Web Services
Apache Spark
Continuous Delivery
Git
IT Architecture
Microsoft Azure

Job Details

The following requirement is open with our client.

Client : TCS

Title : Azure Databricks Architect

Location : New York, NY

Duration : 12 Months

Detailed Job Description:

We are looking for an experienced Databricks Architect with deep expertise in Lakehouse and Medallion Architecture (Bronze, Silver, Gold layers) to lead the design and implementation of scalable, high-performance data solutions on the Databricks platform.

The ideal candidate will have a strong background in big data architecture, data governance, and cloud-native data engineering using technologies like Delta Lake, Spark, and Apache Airflow.

Design, architect, and implement end-to-end data pipelines using Databricks Lakehouse and Medallion architecture (Bronze, Silver, Gold layers).

Lead development teams in building scalable and reusable frameworks for ingestion, transformation, and analytics.

Define data modeling strategies and partitioning schemes to optimize performance and cost.

Ensure data governance, quality, and security standards are incorporated throughout the architecture.

Work closely with business stakeholders, data scientists, and analysts to align technical architecture with business objectives.

Set up and manage CI/CD pipelines, orchestrators (e.g., Airflow, Databricks Workflows), and Delta Live Tables.

Act as a subject matter expert for Databricks, Delta Lake, Unity Catalog, and cloud-native data platforms (e.g., AWS, Azure, Google Cloud Platform).

Conduct design reviews, code reviews, and technical workshops to mentor engineering teams.

Strong experience with the Medallion architecture (Bronze/Silver/Gold) data layer design pattern.

Proficiency in Python, SQL, and PySpark.

Experience with cloud platforms: AWS (preferred), Azure, or Google Cloud Platform.

Hands-on experience with Delta Live Tables, Unity Catalog, and Lakehouse Federation is a plus.

Knowledge of orchestration tools like Apache Airflow, Databricks Workflows, or Azure Data Factory.

Familiarity with CI/CD tools (Git, Azure DevOps, Jenkins, etc.).

Solid understanding of data security, RBAC/ABAC, and data governance practices.

Strong analytical and problem-solving skills with attention to detail.

Certifications preferred: Databricks Certified Data Engineer Professional, Lakehouse Fundamentals, or Solutions Architect.

Prior experience in banking/finance domains is a plus.

Must Have Skills:

Azure or Google Cloud Platform

Python

Databricks
