BIG DATA DEVELOPER/ARCHITECT

Java, Hadoop, Spark, Workflow, Hive, Big Data, ZooKeeper
Contract Corp-To-Corp, Contract Independent, Contract W2, 6 MONTHS
Depends On Experience
Work from home not available. Travel not required.

Job Description

Hi,

Hope you are doing well.

We have a very good position with our client. Please let me know if you are comfortable with the job description below, and send me your updated resume.

Title: Big Data Developer/Architect

Location: Monroe, LA.

Duration: 6+ months

Interview: Phone, then Skype.

We need a resource with very strong credentials.

Position Overview:

  • As a Big Data (Hadoop) Architect/Developer, you will be responsible for Cloudera Hadoop development and high-speed querying; for managing and deploying Flume, Kafka, Hive, and Spark; and for overseeing the handover to operational teams and proposing best practices and standards (a brief illustrative sketch follows this overview).
  • Expertise in designing, building, installing, configuring, and developing within the Hadoop ecosystem is required. Familiarity with Pentaho and NiFi is a bonus skill set.
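
For illustration only, here is a minimal, hedged sketch of the kind of pipeline described above: a Spark Structured Streaming job (Java) that reads events from Kafka and lands them as Parquet on HDFS, where they could back a Hive external table. The broker, topic, output path, and checkpoint path are placeholders, not details from this posting.

// Minimal sketch only: Spark Structured Streaming from Kafka to HDFS/Hive-backed storage.
// Broker, topic, and path names below are illustrative placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;
import java.util.concurrent.TimeoutException;

public class KafkaToHiveSketch {
    public static void main(String[] args) throws TimeoutException, StreamingQueryException {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-hive-sketch")
                .enableHiveSupport()          // assumes a Hive metastore is configured on the cluster
                .getOrCreate();

        // Read raw events from a Kafka topic (placeholder broker/topic names).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker1:9092")
                .option("subscribe", "events_topic")
                .load()
                .selectExpr("CAST(key AS STRING) AS event_key",
                            "CAST(value AS STRING) AS event_value");

        // Append Parquet files to a path that a Hive external table could point at;
        // the checkpoint location makes the sink restartable.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "/warehouse/events_raw")
                .option("checkpointLocation", "/checkpoints/events_raw")
                .start();

        query.awaitTermination();
    }
}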

Principal Duties and Responsibilities

  • Work with development teams within the data and analytics team to design, develop, and execute solutions to derive business insights and solve clients' operational and strategic problems.
  • Support the development of data and analytics solutions and products that improve existing processes and decision-making.
  • Build internal capabilities to better serve clients and demonstrate thought leadership in the latest innovations in big data and advanced analytics.
  • Contribute to business and market development.

Specific skills and abilities:

  • Defining job flows
  • Managing and reviewing Hadoop log files
  • Managing Hadoop jobs using a scheduler
  • Coordinating cluster services through ZooKeeper
  • Supporting MapReduce programs running on the Hadoop cluster
  • Ability to write MapReduce jobs (see the sketch after this list)
  • Experience in writing Spark scripts
  • Hands-on experience with HiveQL
  • Familiarity with data-loading tools like Flume and Sqoop
  • Knowledge of workflow schedulers like Oozie
  • Knowledge of ETL tools like Pentaho
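
As referenced in the list above, here is a minimal, hedged sketch of a Java MapReduce job of the kind being asked for: the classic word count. Class names and input/output paths are illustrative placeholders, not part of the client's codebase.

// Minimal sketch only: a word-count MapReduce job with a mapper, combiner, and reducer.
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountSketch {

    public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);        // emit (word, 1) per token
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));  // total count per word
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count sketch");
        job.setJarByClass(WordCountSketch.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input path supplied at submit time
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output path supplied at submit time
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}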


Qualifications & Skills:

  • Bachelor's degree in a related technical field preferred
  • Expertise with HBase, NoSQL, HDFS, and Java MapReduce for Solr indexing, data transformation, and back-end programming; Java, JavaScript, Node.js, and OOAD (a brief HBase example follows this list)
  • 7+ years of experience in IT, with a minimum of 2 years of experience in Hadoop.
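
As referenced above, a brief, hedged sketch of writing a row to HBase from Java. The table, column family, qualifier, and values are placeholders, and an hbase-site.xml on the classpath is assumed.

// Minimal sketch only: write a single cell to an HBase table using the client API.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBasePutSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();  // picks up hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("customer_events"))) {
            // One row keyed by a customer id, one cell in column family "d" (placeholder names).
            Put put = new Put(Bytes.toBytes("customer-001"));
            put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("last_event"), Bytes.toBytes("login"));
            table.put(put);
        }
    }
}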

Thanks & Regards

Vaishali Rana

Technical Recruiter

Xchange Software Inc

10 Austin Avenue, Iselin, NJ - 08830.

Phone: 732.313.2023

Fax: 732.601.4641

vaishali.rana@xchangesoft.net

www.xchangesoft.com

Posted By

Vaishali Rana

Dice Id : 10423087
Position Id : 2020-1259