- 2+ years working experience in USA, as this is with a federal client.
- Hadoop/Big Data developer certification.
- 10+ years of IT experience.
- Experience with Ansible is a big plus.
- Experience with Apache Ambari is a big plus.
Position 1: Senior Big Data/Hadoop Developer (2 openings)
Position 2: Big Data/Hadoop Architect (1 opening)
Role & Responsibilities:
Design and implement Big Data analytic solutions on a Hadoop-based platform. Create custom analytic and data mining algorithms to help extract knowledge and meaning from vast stores of data. Refine a data processing pipeline focused on unstructured and semi-structured data. Support both quick-turn, rapid implementations and larger-scale, longer-duration analytic capability implementations.
- Hadoop development and implementation.
- Loading from disparate data sets.
- Pre-processing using Hive and Pig.
- Designing, building, installing, configuring and supporting Hadoop.
- Translate complex functional and technical requirements into detailed design.
- Perform analysis of vast data stores and uncover insights.
- Maintain security and data privacy.
- Create scalable and high-performance web services for data tracking.
- High-speed querying.
- Managing and deploying HBase.
- Propose best practices/standards.
Required Skills & Experience:
- Experience with Hadoop and the HDFS ecosystem
- Strong experience with Apache Spark, Storm, and Kafka is a must
- Experience with Python, R, Pig, Hive, Kafka, Knox, Tomcat and Ambari
- Experience with MongoDB
- A minimum of 4 years of experience with HBase/Hive/MRv1/MRv2 is required
- Experience in integrating heterogeneous applications is required
- Experience working with a Systems Operations department to resolve a variety of infrastructure issues
- Experience with Core Java, Scala, Python, R
- Experience with relational database systems (SQL) and hierarchical data management
- Experience with MapReduce
- Experience with ETL tools such as Sqoop and Pig
- Data modeling and implementation
- Experience working with market/streaming data and time-series analytics
- Experience with different caching strategies
- Experience with multiple data-movement solutions, such as file copy, pub-sub, and FTP
- Development of web-based and digital frameworks for content delivery
- Experience with batch processing
- Experience working with Hortonworks or Cloudera (preferred)
- Experience with DataTorrent/Apex and Pentaho is a plus
- Experience with Navigator is a plus
- Experience with REST API is a plus
- Experience with stream processing is a plus
- Exposure to encryption tools (HP Voltage) is a plus
- Exposure to NoSQL stores is a plus