Overview
Remote
32 - 33
Full Time
No Travel Required
Unable to Provide Sponsorship
Skills
ADF
Auditing
BMC Control-M
Cloud Computing
Conflict Resolution
Continuous Delivery
Data Loading
Data Quality
Databricks
Functional Requirements
Continuous Integration
Data Engineering
Data Governance
Optimization
Performance Tuning
Problem Solving
Data Lake
Issue Resolution
Job Scheduling
Low-Level Design
Microsoft Azure
Production Support
PySpark
RBAC
Real-time
Regulatory Compliance
Release Management
SQL
Technical Drafting
Technical Writing
Workflow
Job Details
This role is responsible for the design, development, production support, and operational stability of an enterprise Databricks-based data lake. The engineer will work closely with client platform teams, infrastructure teams, and cross-vendor product teams to ensure reliable, scalable, and secure data operations.
This is a hands-on engineering role with strong ownership of production support and platform operations.
Key Responsibilities
Data Engineering & Platform
- Design, develop, and support scalable data pipelines using Azure Data Factory (ADF) and Databricks (PySpark & SQL)
- Implement and operate Medallion architecture (Bronze, Silver, Gold layers)
- Build restartable, fault-tolerant, and scalable batch data pipelines
- Integrate with accelerator or connector-based ingestion frameworks
- Monitor, troubleshoot, and resolve ADF and Databricks job failures
- Support production data pipelines with defined SLAs
- Implement logging, reconciliation, and data quality checks
- Perform cost optimization and performance tuning in Databricks
- Use job scheduling tools such as Control-M or equivalent
- Support post go-live stabilization and BAU (Business-As-Usual) operations
- Define and operate within L2/L3 support models
- Implement CI/CD pipelines for data engineering workflows
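To illustrate the "restartable, fault-tolerant" batch pattern named above: the core idea is persisting a checkpoint of completed batches so a re-run after a failure skips work already landed. A minimal plain-Python sketch (in a real pipeline the checkpoint would live in Delta/ADLS and the processing would be a Databricks job; the file path and function names here are hypothetical):

```python
import json
import os

CHECKPOINT_FILE = "pipeline_checkpoint.json"  # hypothetical checkpoint location


def load_checkpoint():
    """Return the set of batch IDs already processed, or empty on first run."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return set(json.load(f))
    return set()


def save_checkpoint(done):
    """Persist the set of completed batch IDs."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump(sorted(done), f)


def run_pipeline(batches, process):
    """Process each (batch_id, records) pair at most once.

    Safe to re-run after a failure: completed batches are skipped.
    """
    done = load_checkpoint()
    for batch_id, records in batches:
        if batch_id in done:
            continue  # already landed in a previous run; skip for idempotency
        process(batch_id, records)
        done.add(batch_id)
        save_checkpoint(done)  # persist progress after every batch
    return done
```

Re-running `run_pipeline` with the same input after a crash only processes the batches that had not yet been checkpointed.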
Security, Access & Governance
- Apply RBAC, service principals, and managed identities
- Operate within enterprise data governance frameworks
- Implement secure data access patterns across cloud platforms
- Support access reviews, audits, and compliance checks
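Conceptually, the RBAC model above maps roles to sets of permissions and checks requests against them; in practice this is done through Azure AD role assignments and Databricks grants rather than application code. A toy sketch of the idea (role and permission names are hypothetical):

```python
# Minimal role-based access control (RBAC) check.
# Role and permission names are illustrative only.
ROLE_PERMISSIONS = {
    "data_engineer": {"read_bronze", "write_silver", "write_gold"},
    "analyst": {"read_gold"},
    "auditor": {"read_bronze", "read_silver", "read_gold"},
}


def is_allowed(role, permission):
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

An unknown role grants nothing, which mirrors the deny-by-default posture expected in enterprise governance frameworks.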
Delivery & Collaboration
- Work directly with business stakeholders and technology teams
- Lead technical discussions, issue resolution, and design clarifications
- Coordinate with platform product teams on enhancements and backlog items
- Translate functional requirements into technical design and implementation plans
- Create and maintain technical documentation, including:
  - Low-Level Design documents (LLDs)
  - Runbooks
  - Operational playbooks
Onshore Expectations
- Onshore presence to ensure real-time production support
- Act as the primary escalation point during business hours
- Own daily platform health checks and data load validations
- Support release planning, cutover activities, and post-deployment validation
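The daily "data load validation" duty above typically reduces to reconciling source and target row counts. A minimal sketch of such a check (the function name and tolerance parameter are hypothetical; real checks would query the source system and the lake tables):

```python
def validate_load(source_count, target_count, tolerance=0):
    """Compare source vs. target row counts.

    Returns a small result dict; `ok` is False when the difference
    exceeds the allowed tolerance, signalling a failed load.
    """
    diff = abs(source_count - target_count)
    return {
        "source": source_count,
        "target": target_count,
        "diff": diff,
        "ok": diff <= tolerance,
    }
```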
Required Experience & Skills
- 3–4+ years of experience in Data Engineering or Platform Engineering
- Strong hands-on experience with:
  - Azure Data Factory
  - Azure Databricks (PySpark & SQL)
- Experience working in large, multi-vendor enterprise environments
- Strong troubleshooting and problem-solving skills in distributed systems
- Ability to work in a fast-paced delivery model
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.