Job Description and Responsibilities:
The candidate's primary role will be Hadoop Developer. As a developer on the team, the resource will write Linux shell scripts, set up Autosys jobs, and write Pig scripts, Hive queries, Oozie workflows, and MapReduce programs. In addition to development activities, the resource will participate in analysis and design and will complete project documentation.
The goal of this project is to migrate an ETL batch process from mainframe/Teradata to Hadoop.
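As one hedged illustration of the kind of transfer this migration involves, the sketch below builds a Sqoop import command for pulling a Teradata table into HDFS. The connection string, table name, and target path are all hypothetical placeholders, and the script only prints the command it would run rather than executing it.

```shell
#!/bin/sh
# Hypothetical connection details -- replace with real values.
JDBC_URL="jdbc:teradata://td-host/DATABASE=edw"
TABLE="daily_transactions"          # hypothetical source table
TARGET_DIR="/data/staging/${TABLE}" # hypothetical HDFS landing path

# Build the Sqoop import command. The script echoes it instead of
# running it so the sketch stays side-effect free.
CMD="sqoop import --connect ${JDBC_URL} --table ${TABLE} \
--target-dir ${TARGET_DIR} --num-mappers 4 --as-textfile"

echo "${CMD}"
```

A real batch job would typically wrap a command like this in error handling and log capture before handing it to the scheduler.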
Must have these skills:
1. Minimum of 4 years of experience with Core Java
2. Must have 4+ years of development experience in an RDBMS such as MS SQL
3. Experience writing shell scripts in Linux or UNIX
4. Experience with Autosys
5. Must have excellent, in-depth knowledge of SQL
6. Experience analyzing text and streams with emerging Hadoop-based big data and NoSQL technologies:
• Hands-on experience running Pig and Hive queries.
• Analyzing data with Hive, Pig, and HBase.
• Hands-on experience with Oozie.
• Importing and exporting data with Sqoop between HDFS and relational database systems/mainframe.
• Loading data into HDFS.
• Developing MapReduce programs to format the data.
7. Experience developing a batch process
8. Eagerness to contribute, collaborate, and work in a team environment
9. Must have excellent communication skills.
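To give candidates a sense of the day-to-day work described above, the sketch below shows the shape of a shell wrapper around a Hive query, of the kind an Autosys job might invoke in a Hadoop ETL batch. The database, table, and column names are hypothetical, and the script prints the generated HiveQL rather than submitting it.

```shell
#!/bin/sh
# Hypothetical batch date parameter, as a scheduler might pass it.
BATCH_DATE="${1:-2016-01-01}"

# Hypothetical HiveQL aggregating a staging table into a reporting
# table, partitioned by load date.
HQL="INSERT OVERWRITE TABLE reports.daily_summary PARTITION (load_date='${BATCH_DATE}')
SELECT account_id, SUM(amount)
FROM staging.daily_transactions
WHERE txn_date = '${BATCH_DATE}'
GROUP BY account_id;"

# A real job would submit this with: hive -e "${HQL}"
# The sketch just prints the query so it is side-effect free.
echo "${HQL}"
```

In practice a wrapper like this would also check exit codes and write to a log file so the scheduler can detect and alert on failures.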