Hadoop Administrator

HDFS, Hive, Pig, HBase/Cassandra, Flume, ZooKeeper, Kerberos, OpenLDAP, SAML, Active Directory, LDAP, Linux, Cloudera, Spark, MapReduce, Zeppelin, Python, Bash, PowerShell, Perl, Java
Contract W2, 6 months+
$100/hr C2C, $90/hr W2
Telecommuting not available. Travel not required.

Job Description

*This position requires US citizenship.

 

We are seeking a hands-on, seasoned Hadoop Administrator to work on a mission-critical information-sharing platform that enables the sharing of secure, accurate, privacy-controlled data with approved stakeholders while protecting sensitive data and preserving privacy within Federal Agency components and other national security agencies. The system is deployed across the classified and unclassified domains within the Federal Agency and housed at Federal Agency data centers.

We are looking for a highly skilled IT engineer with deep experience in the deployment and administration of Hadoop. The individual will work closely with customers and development teams to create, manage, upgrade, and secure Hadoop clusters.

 

Responsibilities:

Deploy the Hadoop Distributed File System (HDFS) and peripheral ecosystem technologies (Hive, Pig, HBase/Cassandra, Flume, ZooKeeper) using both automated toolsets and manual processes.
Maintain, support, and upgrade Hadoop clusters.
Monitor jobs, queues, and HDFS capacity using ZooKeeper and vendor-specific front-end cluster management tools.
Balance, commission, and decommission cluster nodes (see the node-lifecycle sketch after this list).
Apply security (Kerberos, OpenLDAP, SAML), integrating with Active Directory and/or LDAP (see the Kerberos sketch after this list).
Enable users to view job progress via web interface.
Onboard users to Hadoop: configuration, access control, disk quotas, permissions, etc. (see the onboarding sketch after this list).
Address all issues, apply upgrades and security patches.
Commission/decommission nodes; perform backup and restore.
Apply "rolling" cluster node upgrades in a Production-level environment.
Rack newly purchased hardware with switches; assign IP addresses; configure firewalls, ports, VPNs, etc.
Work with the virtualization team to provision and manage HDP (Hortonworks Data Platform) cluster components.
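
To give a concrete flavor of the node-lifecycle work above, the following is a minimal sketch using stock HDFS tooling; the hostname and exclude-file path are placeholders, and the actual exclude-file location is whatever dfs.hosts.exclude points to in hdfs-site.xml.

    # Check overall capacity and per-DataNode status.
    hdfs dfsadmin -report

    # Decommission a node: add it to the exclude file referenced by
    # dfs.hosts.exclude, then have the NameNode re-read its host lists.
    echo "worker-07.example.gov" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes

    # Re-commission by removing the host from the exclude file and
    # refreshing again, then rebalance block placement; -threshold is
    # the allowed deviation (in percent) from average utilization.
    hdfs balancer -threshold 10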
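
For the Kerberos item, a quick sanity check on a secured cluster might look like the following; the keytab path, principal, and realm are hypothetical.

    # Obtain a ticket from a service keytab.
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.GOV

    # Confirm the ticket exists and that authenticated HDFS access works.
    klist
    hdfs dfs -ls /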
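
User onboarding likewise reduces to a handful of HDFS commands; a minimal sketch, assuming a new user "alice" in group "analysts" (both hypothetical):

    # Create the user's home directory and hand ownership to the user.
    hdfs dfs -mkdir -p /user/alice
    hdfs dfs -chown alice:analysts /user/alice
    hdfs dfs -chmod 750 /user/alice

    # Cap raw disk usage (replicas included) and verify the quota.
    hdfs dfsadmin -setSpaceQuota 2t /user/alice
    hdfs dfs -count -q -h /user/alice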

Requirements

Minimum 3 years of Linux/Unix administration.
Minimum 3 years of Cloudera/Hortonworks Hadoop administration experience.
Extensive experience with the Hadoop ecosystem, including Spark, MapReduce, HDFS, Hive, HBase, and Zeppelin.

1 year of experience with Hadoop-specific automation (e.g., blueprints; see the automation sketch after this list).
1 year of technical experience managing Hadoop cluster infrastructure (e.g., data center infrastructure) utilizing at least 20 data nodes.

Experience scripting in one or more of Python, Bash, PowerShell, Perl, or Java.
1 year of experience with Puppet and/or Chef (nice to have, not mandatory).
1 year of virtualization experience with VMware, Hyper-V, or KVM (nice to have, not mandatory).
Ability to thrive in a dynamic technology environment.
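
As one concrete form of Hadoop-specific automation, an Ambari Blueprint can be registered and a cluster instantiated from it through Ambari's REST API; the sketch below assumes an Ambari server at ambari.example.gov:8080 and locally prepared blueprint.json / cluster-template.json files, all of which are placeholders.

    # Register a blueprint (cluster topology plus service configurations).
    curl -u admin:admin -H "X-Requested-By: ambari" \
      -X POST -d @blueprint.json \
      http://ambari.example.gov:8080/api/v1/blueprints/secure-hdp

    # Instantiate a cluster from the blueprint via a host-mapping template.
    curl -u admin:admin -H "X-Requested-By: ambari" \
      -X POST -d @cluster-template.json \
      http://ambari.example.gov:8080/api/v1/clusters/prod-hdp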

Preferred

Cloudera Certified Hadoop Administrator.
Red Hat certification.
Networking (TCP/IP, Routers, IP addressing, use of network tools)
Analyzing data with Hive, Pig, and/or HBase.
Data ingestion, streaming, or importing/exporting RDBMS data using Sqoop (see the Sqoop sketch after this list).
DBA experience.
RDBMS SQL Development.
Managing cluster-hardening activities through the implementation and maintenance of security and governance components across various clusters.
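
For the Sqoop item, a typical import of an RDBMS table into HDFS looks like the sketch below; the JDBC URL, credentials, table, and target directory are all hypothetical.

    # Pull the "orders" table from MySQL into HDFS with four parallel
    # mappers; -P prompts for the password instead of exposing it on the
    # command line.
    sqoop import \
      --connect jdbc:mysql://db.example.gov:3306/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4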

Education

Bachelor’s degree in Computer Science or equivalent, plus 10 years of IT experience.

Posted By

1600 Tysons Blvd, Suite 800 McLean, VA, 22102

Dice Id : 10113944
Position Id : 908468