Big Data Administrator

  • TalTeam
  • Washington, DC
Hadoop technical stack (e.g. MapReduce, YARN, Hive, HDFS, Oozie), Kudu, Spark, Kafka, Avro, NoSQL stores (e.g. HBase), Hadoop Distribution, Cloudera, Hortonworks, MapR or IBM BigInsights
Full Time, Contract Corp-To-Corp, Contract Independent, Contract W2, C2H Corp-To-Corp, C2H Independent, C2H W2, Part Time, 12+ Months
Depends On Experience
Telecommuting not available. Travel not required.

Job Description

Position: Big Data Administrator (direct client; in-person is a must)
Location: Washington, DC
Duration: 12+ Months  

Job description: 

  • Provide support for successful installation and configuration of new Big Data clusters across various environments
  • Provide support for successful expansion of Big Data clusters across various environments
  • Provide day-to-day Big Data administration support
  • Work with the Big Data team, Infrastructure Architect, and Change Management team to support configuration and code migrations of Big Data deliverables
  • Leverage reusable code modules to solve problems across the team, including data preparation and transformation, and data export and synchronization
  • Act as a Big Data administration liaison with Infrastructure, Security, Application Development, and Project Management
  • Keep current on the latest Big Data technologies and products, including hands-on evaluations and in-depth research
  • Work with the Big Data lead/architect to perform detailed planning and risk/issue management
Required Qualifications
  • 5+ years of administrator experience working with batch processing and tools in the Hadoop technical stack (e.g. MapReduce, YARN, Hive, HDFS, Oozie)
  • 5+ years of administrator experience working with tools in the stream processing technical stack (e.g. Kudu, Spark, Kafka, Avro)
  • Administrator experience with NoSQL stores (e.g. HBase)
  • Expert scripting skills
  • Expert knowledge of Active Directory/LDAP security integration with Big Data
  • Hands-on experience with at least one major Hadoop distribution such as Cloudera, Hortonworks, MapR, or IBM BigInsights (preferably Cloudera)
  • Hands-on experience monitoring and reporting on Hadoop resource utilization
  • 5+ years of data-related benchmarking, performance analysis, and tuning
  • Hands-on experience supporting code deployments (Spark, Hive, Ab Initio, etc.) into the Hadoop cluster
  • 4+ years of experience with SQL and at least two major RDBMSs
  • 6+ years as a systems integrator with Linux systems and shell scripting
  • Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field, and 8+ years of software development/DW & BI experience
  • Excellent verbal and written communication skills
Preferred Qualifications
  • Hadoop administration experience on a Cloudera platform
  • Healthcare industry experience
  • Master of Science in Mathematics, Engineering, or Computer Science
  • ETL solution experience, preferably on Hadoop
  • Experience with industry-leading Business Intelligence tools
  • Near-real-time use case experience
  • Solr search experience
 
Thank you, 

Posted By

13800 Coppermine Rd, 1st Fl Herndon, VA, 20171

Contact
Dice Id: 10110436
Position Id: SK080119TT1