Big Data Developer

Hadoop, Hive, ETL
Full Time, Contract W2
Depends On Experience
Telecommuting not available | Travel not required

Job Description

Position Description
Experience and Responsibilities:

• Experience with Big Data tools and technologies

• Multiple Hadoop platform implementations

• Experience with large-scale distributed applications

• Analytical and problem-solving skills applied in a Big Data environment

• Big Data development using the Hadoop ecosystem, including Hive, Sqoop, Spark, and other Cloudera tools

• Proven understanding of and experience with Hadoop, HBase, Pig, Flume, MapReduce, and Kafka

• At least one year of hands-on Java development experience is required.

• At least one year of hands-on Python development experience is required.

• Hands-on experience with related/complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef, Puppet)

• Experience with Agile/Scrum methodologies: iterating quickly on product changes, developing user stories, and working through the backlog

• Experience with Cloudera Hadoop distribution components and custom packages preferred

• Banking experience preferred.

• Traditional Data Warehouse/ETL experience required

• Excellent planning, organization, communication and thought leadership skills

• Coordinate with the team on project deliverables; lead and document project status meetings

• Highly organized, with good time-management skills and a customer-service orientation

Posted By

11111 Carmel Commons Blvd, Ste 155, Charlotte, NC 28226

Dice Id: 10313336
Position Id: 953589