Overview:
We are seeking an experienced Azure Databricks Engineer to design, implement, and optimize modern Lakehouse architectures in a regulated financial services environment. The role blends hands-on engineering with technical leadership, requiring deep technical expertise alongside the ability to enforce governance and security at scale.
Responsibilities:
Architect, build, and maintain Azure-based Lakehouse solutions using Databricks, Delta Lake, and Azure-native services.
Implement medallion (bronze/silver/gold) architectures for scalability, governance, and analytics agility.
Develop and optimize data ingestion frameworks for batch and streaming pipelines (PySpark, Spark SQL).
Configure and manage Unity Catalog with row- and column-level security, role-based access, and lineage tracking.
Integrate siloed systems into unified, governed data platforms.
Ensure compliance with financial services regulations and secure data delivery.
Optimize Spark workloads for performance and cost efficiency.
Support BI/analytics teams with performant semantic layers.
Drive best practices in SDLC, CI/CD, testing, and DevOps within Azure and Databricks ecosystems.
Qualifications:
7+ years in data engineering, including 3+ years with Azure Databricks.
Strong expertise in Azure services (Data Factory, Azure SQL, Microsoft Fabric).
Hands-on experience with Unity Catalog security and governance.
Deep understanding of financial services or similarly regulated industries.
Proficient in Python, PySpark, Spark SQL, and T-SQL.
Skilled in performance tuning and cost optimization of large-scale Spark workloads.
Strong knowledge of data governance, compliance, and security.
Excellent communication skills to bridge technical and business needs.
Proven problem-solver with a delivery mindset.
Team player with mentoring and best-practice leadership experience.