Hadoop Developer

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2

Skills

HDFS
Apache Hadoop
Kerberos

Job Details

  • Cluster Management: Setting up, configuring, and maintaining Hadoop clusters, including adding and removing nodes.
  • Performance Tuning: Monitoring and tuning the performance of Hadoop clusters and MapReduce jobs.
  • Data Management: Managing HDFS (Hadoop Distributed File System) and ensuring data integrity and availability.
  • Security: Implementing and managing Hadoop security, including Kerberos integration.
  • User Management: Setting up new Hadoop users, including Linux users and Kerberos principals.
  • Monitoring: Using open-source tools like Ganglia, Nagios, and Ambari to monitor cluster health and performance.
  • Backup and Recovery: Implementing high availability and disaster recovery solutions for Hadoop clusters.
  • Collaboration: Working with data delivery teams and other IT teams to ensure smooth operations and data quality.
  • Technical Skills: Proficiency in Hadoop ecosystem tools such as HDFS, YARN, Hive, Impala, Spark, Kafka, HBase, and Ambari.
  • Experience: Hands-on experience with open-source Hadoop distributions (e.g., Apache Hadoop).
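As a rough illustration of the user-management duties above, onboarding a new user on a Kerberized Hadoop cluster typically involves a Linux account, a Kerberos principal, and an HDFS home directory. The sketch below uses placeholder names (`jdoe`, `EXAMPLE.COM`, the quota value) that are assumptions for illustration, not details from this posting:

```shell
# Sketch: onboarding a new Hadoop user on a Kerberized cluster.
# USER, REALM, and the quota value are illustrative placeholders.

USER=jdoe
REALM=EXAMPLE.COM

# 1. Create the Linux account on the cluster nodes.
sudo useradd -m "$USER"

# 2. Create a matching Kerberos principal (prompts for a password).
sudo kadmin.local -q "addprinc ${USER}@${REALM}"

# 3. Provision an HDFS home directory owned by the new user.
hdfs dfs -mkdir -p "/user/${USER}"
hdfs dfs -chown "${USER}:${USER}" "/user/${USER}"

# 4. Optionally cap the user's HDFS space usage (here ~100 GB).
hdfs dfsadmin -setSpaceQuota 100g "/user/${USER}"
```

These commands are environment-dependent: in practice the principal would be created on the KDC host, and the HDFS steps run as a user holding a valid superuser ticket.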

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.