ETL DataStage with Azure Databricks

Overview

On Site
$50 - $60
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 1 Year(s)

Skills

Agile
Attention To Detail
Batch Processing
Cloud Computing
Collaboration
Communication
Continuous Delivery
Continuous Improvement
Continuous Integration
Data Engineering
Data Integration
Data Lake
Data Processing
Data Storage
Data Warehouse
Database
Databricks
DevOps
Extract, Transform, Load (ETL)
FOCUS
IBM InfoSphere DataStage
IBM Tivoli Workload Scheduler (TWS)
ITIL
Leadership
Management
Mentorship
Microsoft Azure
NoSQL
Orchestration
Organizational Skills
Performance Metrics
Problem Management
Product Support
Python
Real-time
Reporting
SQL
Scala
Scalability
Scheduling
Scripting
Scrum
Storage
Workflow

Job Details

We are seeking an experienced and detail-oriented Product Support Lead to join our Data Warehouse Operations team. This role will focus on maintaining, monitoring, and troubleshooting data pipelines and batch processes within a complex architecture that includes Azure, Databricks, and TWS (Tivoli Workload Scheduler). You will be responsible for ensuring the smooth operation of critical data workflows, addressing issues in real time, and collaborating with cross-functional teams to improve our data infrastructure.

Key Responsibilities:
Monitor and ensure the health and stability of data pipelines and batch jobs running across Azure, Databricks, and TWS.
Proactively identify, investigate, and resolve data pipeline failures or performance bottlenecks.
Lead the triage and troubleshooting of production issues, working closely with engineering and operations teams.
Collaborate with data engineers to optimize the performance and scalability of data workflows.
Ensure adherence to SLAs and report on pipeline performance metrics.
Maintain and update operational runbooks, documenting key workflows and troubleshooting steps.
Manage the lifecycle of TWS job schedules, ensuring seamless batch processing and on-time execution.
Provide leadership and mentorship to junior support team members.
Develop and implement continuous improvement strategies for data operations.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
15+ years of experience in a data operations or product support role, preferably in data warehouses or cloud-based environments.
Strong experience with Azure cloud services, including data storage (e.g., Azure Data Lake, Blob Storage) and compute (e.g., Azure Databricks).
Hands-on experience with Databricks for data engineering, ETL processing, and pipeline orchestration.
Deep understanding of TWS (Tivoli Workload Scheduler) for batch processing and scheduling.
Strong troubleshooting skills, with the ability to quickly diagnose and resolve production issues.
Knowledge of database technologies (SQL, NoSQL) and data integration tools.
Excellent communication skills, with the ability to work collaboratively across technical and business teams.
Strong organizational skills and the ability to manage multiple priorities in a fast-paced environment.

Preferred Qualifications:
Experience with Python or Scala for data processing and scripting.
Familiarity with CI/CD pipelines and DevOps practices in data environments.
Knowledge of ITIL processes and best practices for incident and problem management.
Experience working in an Agile/Scrum environment.
