Job Details
Position: Technical Service Delivery Manager
Location: Onsite
Experience: 12+ Years
Skills: Hadoop, BigQuery, AI/ML, Cloud Platforms, Team Management (Onshore/Offshore)
We are seeking a highly skilled Technical Service Delivery Manager with strong hands-on experience in Hadoop and Google BigQuery, along with working knowledge of AI/ML technologies. This role requires a seasoned leader capable of managing large-scale data platforms, ensuring seamless service delivery, and coordinating onshore and offshore teams to meet client expectations and business goals.
Key Responsibilities
Service Delivery & Operations
Oversee end-to-end technical service delivery for data platforms built on Hadoop, BigQuery, and cloud ecosystems.
Ensure high availability, scalability, and performance of data pipelines and data warehouse environments.
Monitor SLAs, KPIs, and service metrics, and proactively drive issue resolution.
Manage change, incident, and problem management processes following ITIL best practices.
Technical Leadership
Provide hands-on technical guidance on Hadoop ecosystem components (HDFS, Hive, Spark, YARN, Oozie, Kafka).
Lead optimization of BigQuery environments including performance tuning, cost optimization, and data modeling.
Work closely with engineering teams to integrate AI/ML models into data pipelines and operational platforms.
Drive automation and efficiency improvements using modern DevOps and cloud-native tools.
Team & Stakeholder Management
Lead and mentor both onshore and offshore teams, including data engineers, analysts, and support engineers.
Coordinate resource allocation, workload distribution, and shift planning for 24/7 or follow-the-sun operations.
Collaborate with cross-functional teams (architecture, product, QA, and cloud operations) to ensure seamless project execution.
Maintain strong customer relationships and act as the primary point of escalation for clients.
Project & Delivery Management
Drive project planning, delivery roadmaps, and release management activities.
Conduct risk assessments and implement mitigation strategies.
Prepare and present weekly/monthly status reports to stakeholders and senior management.
Ensure adherence to compliance, governance, and security standards.
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
12+ years of experience in data engineering, big data technologies, or service delivery roles.
Strong hands-on expertise in the Hadoop ecosystem (Hive, Spark, HDFS, Kafka, Oozie).
Practical experience with Google BigQuery, including SQL, optimization, and cost management.
Working knowledge of AI/ML concepts, model integration, and cloud-based ML tools.
Proven experience managing onshore and offshore teams in a delivery environment.
Strong analytical, problem-solving, and communication skills.
Experience with cloud platforms (Google Cloud Platform, AWS, or Azure) and DevOps tools preferred.
ITIL certification is a plus.