Must Have Technical/Functional Skills
BigData Platform Admin & Strategist
Roles & Responsibilities
Job Title: BigData Platform Admin & Strategist
Job Description:
We are looking for a highly skilled and passionate BigData Platform Admin who acts as a crucial liaison between
the Hadoop admin team and various application development teams. The role is responsible for ensuring the optimal
performance, stability, and future readiness of the Hadoop platform, focusing on strategic oversight rather than
day-to-day administrative tasks. As a strategist, you will facilitate communication, drive best practices, assess the
technical impact of platform changes, and contribute to the overall health and efficiency of the Hadoop ecosystem.
Responsibilities:
Stakeholder Unification: Serve as a single point of contact and unified stakeholder for all Hadoop-related
concerns, bridging the gap between platform administrators and application teams.
Platform Upgrade Management:
Review and assess upcoming Hadoop platform upgrades, including new features, libraries, and patches.
Conduct impact analysis on existing applications and services, identifying potential risks and opportunities.
Coordinate and communicate upgrade schedules and requirements with all relevant teams.
Technical Feature and Library Evaluation:
Identify and evaluate new technical features and libraries within the Hadoop ecosystem that can benefit
application teams or improve platform efficiency.
Propose and advocate for the adoption of new technologies and methodologies to enhance the platform's
capabilities.
Cluster Health and Optimization:
Monitor overall cluster health, performance metrics, and resource utilization.
Propose and implement optimization strategies to improve cluster efficiency, scalability, and
cost-effectiveness.
Collaborate with the admin team to troubleshoot and resolve complex platform-level issues.
Resource Management and Housekeeping:
Oversee and manage the allocation of cluster resources (CPU, memory, storage) across various
applications and tenants.
Establish and enforce policies for resource quota management, data lifecycle, and storage optimization.
Implement housekeeping strategies to maintain a clean and efficient cluster environment.
Best Practices and Overall Excellence:
Define, document, and promote best practices for Hadoop application development, deployment,
and operations.
Ensure operational stability and resiliency of the Hadoop platform, implementing measures to prevent
outages and minimize downtime.
Contribute to disaster recovery and business continuity planning for the Hadoop ecosystem.
Solution Proposal and Innovation:
Research and propose suitable technical solutions to address emerging business needs, performance
bottlenecks, or architectural challenges within the Hadoop ecosystem.
Stay abreast of industry trends and advancements in big data technologies, continuously seeking
opportunities for innovation.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience:
5+ years of experience in a big data environment, with a focus on Hadoop.
Proven experience in a technical leadership or architect role, working closely with both operations and
development teams.
Experience with distributed systems, data processing frameworks (e.g., Spark, Hive), and data warehousing
concepts.
Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and containerization technologies
(e.g., Docker, Kubernetes) is a plus.