Hadoop Administrator

SQL, Linux, Performance, Java, API
Contract W2, Contract Corp-To-Corp, C2H Corp-To-Corp, C2H Independent, C2H W2, Contract Independent
Work from home not available. Travel not required.

Job Description

Terrific Long-Term Contract Opportunity with a FULL suite of benefits!
As one of the largest financial institutions in the world, our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that's not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Hadoop Administrator
Location: Charlotte, NC
Term: 12 months

Day-to-Day Responsibilities:
  • This role works in an Agile Scrum environment with the support of an assigned Scrum Master, dedicated product management, and a team of quality engineers to develop new capabilities and enhancements and provide ongoing maintenance.
  • Technology platforms include MS SQL Server and Hadoop (MapR & Hortonworks).
  • Configuration and setup in AWS or on premises; integration with Hadoop ecosystem components and various SQL engines such as Impala, Hive, and Spark.
Hadoop Platform (MapR & Hortonworks)
  • The candidate's most recent project(s) must involve MapR or Hortonworks installation, configuration, and setup in AWS or on premises, including integration with Hadoop ecosystem components and various SQL engines such as Impala, Hive, and Spark.
  • Automate Hadoop installation in Linux environments.
  • Automate deployment to a Hadoop cluster and its ongoing maintenance.
  • Automate health-check reporting for the Hadoop cluster, monitoring that it is up and running at all times.
  • Automate monitoring of storage data volume and allocation of space in HDFS.
  • Automate resource management in the cluster environment, including creating new nodes and removing unused ones.
  • Automate NameNode configuration to ensure high availability.
  • Automate implementation and administration of the Hadoop infrastructure.
  • Automate deployment of required hardware and software in the Hadoop environment, as well as expansion of existing environments.
  • Automate Linux user creation for Hadoop and its ecosystem components; setting up Kerberos principals is also part of Hadoop blade provisioning.
  • Performance tuning and running jobs in Hadoop clusters.
  • Capacity planning
  • Monitoring connectivity and security of Hadoop cluster
  • Automate managing and reviewing log files in Hadoop.
  • Automate backup and recovery tasks.
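To illustrate the kind of storage-monitoring automation described above, here is a minimal sketch of an HDFS usage-threshold check. It assumes the capacity figures have already been collected elsewhere (for example, by parsing `hdfs dfsadmin -report` or querying the NameNode JMX endpoint); the function names, numbers, and 80% threshold are illustrative only, not part of the role's actual tooling.

```python
#!/usr/bin/env python3
"""Minimal sketch of an HDFS storage-threshold check.

Assumes capacity figures were already collected, e.g. by parsing
`hdfs dfsadmin -report` or querying the NameNode JMX endpoint.
All numbers and the threshold below are illustrative.
"""


def hdfs_usage_alert(capacity_bytes: int, used_bytes: int,
                     threshold_pct: float = 80.0):
    """Return (used percentage, True if it meets/exceeds the threshold)."""
    if capacity_bytes <= 0:
        raise ValueError("capacity must be positive")
    used_pct = 100.0 * used_bytes / capacity_bytes
    return used_pct, used_pct >= threshold_pct


if __name__ == "__main__":
    # Illustrative figures: a 100 TB cluster with 85 TB used.
    pct, alert = hdfs_usage_alert(100 * 10**12, 85 * 10**12)
    print(f"HDFS used: {pct:.1f}% -> {'ALERT' if alert else 'OK'}")
```

In practice a check like this would run on a schedule (e.g. cron) and feed the result to the cluster's alerting system rather than printing it.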
Is this a good fit? (Requirements):
  • Expertise in at least one high-level programming language (Java or Scala preferred)
  • Skills for developing, deploying & debugging cloud applications
  • Skills in API usage, command line interface and SDKs for writing applications
  • Knowledge of key features of Cloud Service Providers
  • Understanding of application lifecycle management
  • Ability to use continuous integration and delivery (CI/CD) pipelines to deploy applications
  • Ability to code to implement essential security measures
  • Skills in writing, correcting and debugging code modules
  • Understanding of the use of containers in development processes

Dice Id : matrixga
Position Id : 169042
Originally Posted : 3 months ago
