Job Details
Job Title: Hadoop Admin
Location: Chicago, IL
Work Arrangement: Hybrid
Client Industry: Financial Services
Duration: 12 Months Contract
Schedule: Standard Hours (Monday - Friday, 8 AM to 5 PM)
Day-to-Day Responsibilities:
- Work on complex, major, or highly visible tasks in support of multiple projects requiring multiple areas of expertise.
- Provide subject matter expertise in managing Hadoop and Data Science Platform operations, focusing on Cloudera Hadoop, Jupyter Notebook, OpenShift, and Docker/container cluster management and administration.
- Integrate solutions with other applications and platforms outside the framework.
- Manage platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, incident/problem/capacity management.
- Serve as a liaison between client partners and vendors, in coordination with project managers, to provide technical solutions that address user needs.
- Manage day-to-day operations for platforms built on Hadoop, Spark, Kafka, Kubernetes/OpenShift, Docker/Podman, and Jupyter Notebook.
- Support and maintain AI/ML platforms such as Cloudera, DataRobot, C3 AI, Panopticon, Talend, Trifacta, Selerity, ELK, KPMG Ignite, and others.
- Perform cluster management, upgrades, bug fixes, deployments, and disaster recovery activities.
- Own incident response, problem management, and capacity planning.
- Implement monitoring, alerting, and forecasting solutions to ensure system reliability.
- Collaborate closely with development teams, vendors, and stakeholders to support agile software delivery.
- Automate platform tasks using tools like Ansible, shell scripting, and Python.
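As an illustration of the automation work named in the last bullet, here is a minimal Python sketch that parses `hdfs dfsadmin -report`-style output to flag DataNodes above a capacity threshold. The sample report text, hostnames, and the 85% threshold are invented for the sketch; a real script would capture the report via `subprocess` on a cluster node.

```python
import re

# Illustrative sample of `hdfs dfsadmin -report` output; values are made up.
# In production this text would come from running the command on the cluster.
SAMPLE_REPORT = """\
Name: 10.0.0.11:9866 (dn1.example.com)
DFS Used%: 62.40%
Name: 10.0.0.12:9866 (dn2.example.com)
DFS Used%: 91.75%
"""

def nodes_over_threshold(report: str, threshold: float = 85.0):
    """Return (hostname, used_pct) pairs for DataNodes above `threshold`."""
    hits = []
    current = None
    for line in report.splitlines():
        m = re.match(r"Name: \S+ \((\S+)\)", line)
        if m:
            current = m.group(1)  # remember which node the next stats belong to
            continue
        m = re.match(r"DFS Used%: ([\d.]+)%", line)
        if m and current:
            pct = float(m.group(1))
            if pct > threshold:
                hits.append((current, pct))
    return hits

print(nodes_over_threshold(SAMPLE_REPORT))  # [('dn2.example.com', 91.75)]
```

A check like this would typically feed an alerting system rather than print to stdout.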
Must Have:
- Strong knowledge of Hadoop Architecture, HDFS, Hadoop Cluster, and Hadoop Administrator's role
- In-depth knowledge of fully integrated AD/Kerberos authentication.
- Experience setting up optimum cluster configurations
- Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, SOLR, Hue, Spark, Hive, YARN, Zookeeper, and Postgres.
- Hands-on experience analyzing various Hadoop log files, compression, encoding, and file formats.
- Scripting and automation skills: Python, Shell, Ansible.
- Experience with monitoring, alerting, and job scheduling systems.
- Knowledge of databases: SQL, Cassandra, Postgres.
- Strong proficiency with Unix/SQL scripting.
- Experience in agile development environments.
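The log-analysis skill listed above can be sketched in a few lines of Python: counting Hadoop (log4j-style) log lines by severity is a common first step when triaging an incident. The sample log lines below are invented for the sketch; a real script would read files from the cluster's log directories.

```python
import re
from collections import Counter

# Illustrative log4j-style Hadoop log lines; content is invented for the sketch.
SAMPLE_LOG = """\
2024-05-01 10:02:11,532 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block
2024-05-01 10:02:12,004 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Slow BlockReceiver write
2024-05-01 10:02:13,118 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in receiveBlock
"""

# Date, time, then the severity level (standard log4j console layout assumed).
LEVEL_RE = re.compile(r"^\S+ \S+ (TRACE|DEBUG|INFO|WARN|ERROR|FATAL)\b")

def level_counts(log_text: str) -> Counter:
    """Count log lines per severity level; continuation lines are ignored."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LEVEL_RE.match(line)
        if m:
            counts[m.group(1)] += 1
    return counts

print(level_counts(SAMPLE_LOG))  # Counter({'INFO': 1, 'WARN': 1, 'ERROR': 1})
```

Grouping by level (or by class name) like this quickly narrows a multi-gigabyte log down to the components worth reading in detail.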
Nice To Have:
- Previous experience with Snowflake, Databricks, or cloud platforms (Azure)
- Previous experience in the financial services industry
Compensation
- Hourly Rate: $65 - $70 per hour
This reflects base compensation and may vary based on candidate qualifications.
Benefits
The Company offers the following benefits for this position, subject to applicable eligibility requirements: medical insurance, dental insurance, vision insurance, 401(k) retirement plan, life insurance, long-term disability insurance, short-term disability insurance, paid parking/public transportation, paid time off, paid sick and safe time, hours of paid vacation time, weeks of paid parental leave, and paid holidays annually as applicable.
About Us
At Collabera, we don't just offer jobs; we build careers. As a global leader in talent solutions, we provide opportunities to work with top organizations, cutting-edge technologies, and dynamic teams. Our culture thrives on innovation, collaboration, and a commitment to excellence. With continuous learning, career growth, and a people-first approach, we empower you to achieve your full potential. Join us and be part of a company that values passion, integrity, and making an impact.