Senior Databricks Engineer / Cloud Data Engineer
Location: Hybrid/Remote (local candidates preferred)
Employment Type: Long-term contract, 12+ months
We're looking for an experienced Databricks Engineer / Cloud Data Engineer to help build and maintain scalable data solutions supporting cybersecurity, compliance, and enterprise analytics initiatives.
This role is hands-on, focused on data pipeline development, cloud migration, Databricks platform engineering, and DevOps automation. You'll work closely with data architects, analysts, and cybersecurity teams to develop efficient, secure, and reliable data flows.
Key Responsibilities:
- Design, build, and optimize data pipelines using Databricks, Apache Spark, and Delta Lake.
- Develop and maintain ETL workflows for structured and semi-structured data.
- Build cloud-native data platforms using Azure Data Factory, Data Lake Storage Gen2, Azure Functions, and SQL databases.
- Configure and manage Databricks clusters, Unity Catalog, and integrations with Azure services.
- Partner with data governance teams to enforce data quality and compliance rules (Collibra experience is a huge plus).
- Support on-prem-to-cloud migration and implement data modernization strategies.
- Collaborate with platform, DevOps, and application teams to ensure secure deployments and efficient CI/CD pipelines.
- Participate in code reviews, performance tuning, and production support activities.
Qualifications:
- 10+ years of experience in data engineering or Databricks development.
- Hands-on experience with Databricks, Apache Spark, and Delta Lake.
- Strong Python and SQL skills for data processing and automation.
- Experience with Azure cloud services (ADF, Data Lake, Functions, SQL Databases).
- Familiarity with DevOps tools (Git, Jenkins, Azure DevOps) and Agile workflow tools (Jira, Confluence).
- Good understanding of ETL/ELT concepts, data modeling, and performance optimization.
- Strong collaboration and problem-solving skills in multi-team environments.
- Familiarity with Collibra or similar data governance tools.
- Experience in financial services, cybersecurity, or compliance-driven data projects.
- Exposure to REST APIs, microservices, and real-time data processing.
- Knowledge of AWS or Google Cloud Platform data tools (secondary).
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Follow us on LinkedIn.