Senior Big Data Hadoop Developer/Architect (10+ Years of IT Experience Required)

Hadoop, Big Data, CCHD, MongoDB, Python, HBase, Hive, Core Java, Scala, SQL
Full Time, Contract Corp-To-Corp, Contract Independent, Contract W2, Long Term
Depends On Experience
Telecommuting not available. Travel not required.

Job Description

Qualification:

  • 2+ years of work experience in the USA, as this position is with a federal client.
  • Hadoop/Big Data Developer certification.
  • 10+ years of IT experience.

 

Senior Big Data Hadoop Developer/Architect

Role & Responsibilities:

Design and implement Big Data analytic solutions on a Hadoop-based platform. Create custom analytic and data mining algorithms to help extract knowledge and meaning from vast stores of data. Refine a data processing pipeline focused on unstructured and semi-structured data. Support both quick-turn, rapid implementations and larger-scale, longer-duration analytic capability implementations. (A minimal pipeline sketch in Scala follows the responsibilities list below.)

  • Hadoop development and implementation.
  • Loading data from disparate data sets.
  • Pre-processing using Hive and Pig.
  • Designing, building, installing, configuring and supporting Hadoop.
  • Translate complex functional and technical requirements into detailed design.
  • Perform analysis of vast data stores and uncover insights.
  • Maintain security and data privacy.
  • Create scalable and high-performance web services for data tracking.
  • High-speed querying.
  • Managing and deploying HBase.
  • Propose best practices/standards.
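
The refinement work described above often takes a shape like the following. This is a minimal sketch, assuming a Spark-on-Hadoop deployment with Hive support enabled; the HDFS path, JSON field names, and target table name are hypothetical placeholders, not details from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object RefineRawEvents {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SemiStructuredRefinement")
          .enableHiveSupport()              // expose the result as a Hive table
          .getOrCreate()

        // Hypothetical landing zone: raw JSON events dropped into HDFS by an ingest job.
        val raw = spark.read.json("hdfs:///data/raw/events/")

        // Basic refinement: drop malformed rows, normalize the timestamp, keep useful columns.
        val refined = raw
          .filter(col("event_id").isNotNull)
          .withColumn("event_ts", to_timestamp(col("event_time")))
          .select("event_id", "event_type", "event_ts", "payload")

        // Persist as a partitioned Hive table so downstream Hive/Pig jobs can query it.
        refined.write
          .mode("overwrite")
          .partitionBy("event_type")
          .saveAsTable("analytics.refined_events")

        spark.stop()
      }
    }

Writing the refined output as a partitioned Hive table keeps it queryable by the Hive and Pig pre-processing steps listed above without an extra export step.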

Skills:

  • Experience with Hadoop and the HDFS ecosystem
  • Strong experience with Apache Spark, Storm, and Kafka is a must (see the Kafka streaming sketch after this list)
  • Experience with Python, R, Pig, Hive, Kafka, Knox, Tomcat and Ambari
  • Experience with MongoDB
  • A minimum of 4 years working with HBase/Hive/MRv1/MRv2 is required
  • Experience in integrating heterogeneous applications is required
  • Experience working with a Systems Operations department to resolve a variety of infrastructure issues
  • Experience with Core Java, Scala, Python, R
  • Experience with relational database systems (SQL) and hierarchical data management
  • Experience with MapReduce
  • Experience with ETL tools such as Sqoop and Pig
  • Data modeling and implementation
  • Experience in working with market / streaming data and time-series analytics
  • Experience working with different caching strategies
  • Experience working with multiple data movement solutions such as file copy, pub/sub, FTP, etc.
  • Development of web-based and digital frameworks for content delivery
  • Experience with batch processing
  • Experience working with Hortonworks or Cloudera (preferred)
  • DataTorrent/Apex and Pentaho are a plus
  • Experience with Navigator is a plus
  • Experience with REST API is a plus
  • Experience with streaming processing is a plus
  • Exposure to encryption tools (HP Voltage) is a plus
  • Exposure to NoSQL stores is a plus
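
For the streaming items above (Spark, Kafka, market/streaming data), the sketch below shows one common pattern: landing a Kafka topic on HDFS with Spark Structured Streaming. This is a minimal sketch, assuming the spark-sql-kafka connector is on the classpath; the broker list, topic name, and output paths are hypothetical placeholders.

    import org.apache.spark.sql.SparkSession

    object KafkaLandingSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("KafkaLandingSketch")
          .getOrCreate()

        // Hypothetical brokers and topic; in practice these come from configuration.
        val ticks = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "market-ticks")
          .load()
          .selectExpr("CAST(key AS STRING) AS symbol",
                      "CAST(value AS STRING) AS tick_json",
                      "timestamp")

        // Append each micro-batch to HDFS as Parquet; the checkpoint directory
        // lets the query resume from the last committed Kafka offsets.
        val query = ticks.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/streaming/ticks/")
          .option("checkpointLocation", "hdfs:///checkpoints/ticks/")
          .outputMode("append")
          .start()

        query.awaitTermination()
      }
    }

The same read side can feed time-series analytics directly (for example, windowed aggregations on event time) instead of, or in addition to, the HDFS landing step.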

Posted By

13800 Coppermine Rd, Suite 138, Herndon, VA 20171

Dice ID: 10491343
Position ID: 919376
