Overview
On Site
Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 6 Month(s)
No Travel Required
Skills
API
Databricks
Catalog
Python
GitHub
Jenkins
ServiceNow
DevOps
IAM
Job Details
Required Skills
- Core Skills
- Strong proficiency in Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows).
- Deep knowledge of Unity Catalog administration and APIs.
- Expertise in Python for automation scripts, API integrations, and data quality checks.
- Experience with governance frameworks (access control, tagging enforcement, lineage, compliance).
- Solid foundation in security & compliance best practices (IAM, encryption, PII).
Additional Skills
- Automation & DevOps
- Experience with CI/CD and deployment pipelines (GitHub Actions, Azure DevOps, Jenkins).
- Familiarity with monitoring/observability tools and building custom logging & alerting pipelines.
- Experience integrating with external systems (ServiceNow, monitoring platforms).
- Additional Skills
- Experience with modern data quality frameworks (Great Expectations, Deequ, or equivalent).
- Strong problem-solving and debugging skills in distributed systems.
- Clear communication and documentation skills to collaborate across GT and D&A teams.
Job Description
Senior Backend Engineer - Metadata Catalog / Governance Automation
Mission
The mission of the Data & Analytics (D&A) team is to enable data users to easily discover, understand, and access trusted data products. A critical enabler of this mission is robust governance and automation within Databricks and Unity Catalog. The Senior Backend Engineer will design, build, and scale automation capabilities that enforce governance standards, improve data quality, and provide transparency into metadata, lineage, and usage. This role will ensure that the Metadata Catalog UI and supporting services are powered by trusted, well-governed, and observable data infrastructure.
Key Responsibilities
- Databricks & Unity Catalog Engineering
- Build and maintain backend services leveraging Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows).
- Administer Unity Catalog including metadata, permissions, lineage, and tags.
- Integrate Unity Catalog APIs to surface data into the Metadata Catalog UI.
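The Unity Catalog integration work above typically goes through the Databricks REST API. A minimal sketch, assuming a workspace host and personal access token supplied via environment variables and the public `/api/2.1/unity-catalog` endpoints (paging omitted for brevity; `uc_tables_url` and `list_tables` are illustrative helper names, not from the posting):

```python
import json
import os
import urllib.request
from urllib.parse import urlencode


def uc_tables_url(host: str, catalog: str, schema: str) -> str:
    """Build the Unity Catalog 'list tables' endpoint URL (API version 2.1)."""
    query = urlencode({"catalog_name": catalog, "schema_name": schema})
    return f"https://{host}/api/2.1/unity-catalog/tables?{query}"


def list_tables(host: str, token: str, catalog: str, schema: str) -> list:
    """Return table metadata dicts for one schema via the REST API."""
    req = urllib.request.Request(
        uc_tables_url(host, catalog, schema),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("tables", [])


if __name__ == "__main__":
    # DATABRICKS_HOST / DATABRICKS_TOKEN are assumed to be set, e.g. a PAT.
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    for table in list_tables(host, token, "main", "default"):
        print(table.get("full_name"))
```

In practice the returned metadata (names, owners, tags, lineage pointers) would be transformed and surfaced into the Metadata Catalog UI rather than printed.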
- Governance Automation
- Develop automation scripts and pipelines to enforce access controls, tagging, and role-based policies.
- Implement governance workflows integrating with tools such as ServiceNow for request and approval processes.
- Automate compliance checks for regulatory and security requirements (IAM, PII handling, encryption).
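As one concrete illustration of the tagging-enforcement responsibility above, a hedged Python sketch of a compliance check. The required tag keys and the table-metadata shape are assumptions for illustration, not from the posting; a real check would read tags from Unity Catalog:

```python
# Illustrative policy: every table must carry these governance tags.
# The tag names and the {"name": ..., "tags": [{"key": ...}]} shape are
# assumed here, not defined by the posting.
REQUIRED_TAGS = {"data_owner", "pii_classification"}


def missing_tags(table: dict) -> set:
    """Return the required governance tags absent from one table's metadata."""
    present = {t["key"] for t in table.get("tags", [])}
    return REQUIRED_TAGS - present


def non_compliant(tables: list) -> dict:
    """Map table name -> missing tags, for every table failing the policy."""
    report = {}
    for table in tables:
        gaps = missing_tags(table)
        if gaps:
            report[table["name"]] = gaps
    return report
```

A pipeline built on this could open a ServiceNow request for each non-compliant table, closing the loop with the approval workflow described above.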
- Data Quality & Observability
- Implement data quality frameworks (Great Expectations, Deequ, or equivalent) to validate datasets.
- Build monitoring and observability pipelines for logging, usage metrics, audit trails, and alerts.
- Ensure high system reliability and proactive issue detection.
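Frameworks such as Great Expectations or Deequ automate validations of the kind listed above. A minimal hand-rolled sketch of one such check, a null-rate expectation, where the column name, threshold, and row shape are all illustrative:

```python
def null_rate(rows: list, column: str) -> float:
    """Fraction of rows where `column` is None (0.0 for an empty input)."""
    if not rows:
        return 0.0
    nulls = sum(1 for row in rows if row.get(column) is None)
    return nulls / len(rows)


def expect_null_rate_below(rows: list, column: str, threshold: float):
    """Mimic a data-quality 'expectation': return (passed, observed_rate)."""
    rate = null_rate(rows, column)
    return rate <= threshold, rate
```

Failed expectations would feed the monitoring and alerting pipelines described in the next bullet group, e.g. emitting an audit-log event per violation.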
- API Development & Integration
- Design and implement APIs to integrate Databricks services with external platforms (ServiceNow, monitoring tools).
- Build reusable automation utilities and integration frameworks for governance at scale.
- DevOps & CI/CD
- Manage source control and CI/CD pipelines (GitHub, Azure DevOps, Jenkins) for backend workflows.
- Deploy scalable and secure backend services in cloud environments (Azure preferred).
- Document, test, and industrialize automation solutions for production environments.
Profile
- Core Skills
- Strong proficiency in Databricks (SQL, PySpark, Delta Lake, Jobs/Workflows).
- Deep knowledge of Unity Catalog administration and APIs.
- Expertise in Python for automation scripts, API integrations, and data quality checks.
- Experience with governance frameworks (access control, tagging enforcement, lineage, compliance).
- Solid foundation in security & compliance best practices (IAM, encryption, PII).
- Automation & DevOps
- Experience with CI/CD and deployment pipelines (GitHub Actions, Azure DevOps, Jenkins).
- Familiarity with monitoring/observability tools and building custom logging & alerting pipelines.
- Experience integrating with external systems (ServiceNow, monitoring platforms).
- Additional Skills
- Experience with modern data quality frameworks (Great Expectations, Deequ, or equivalent).
- Strong problem-solving and debugging skills in distributed systems.
- Clear communication and documentation skills to collaborate across GT and D&A teams.
Education & Experience
- Bachelor's degree in Computer Science, Engineering, or related field OR equivalent professional experience.
- 5+ years of backend engineering experience in data platforms.
- 3+ years working with Databricks and/or Unity Catalog in enterprise environments.
- Demonstrated ability to design and deliver automation solutions for governance, quality, and compliance at scale.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.