Position: 1
Cassandra (Production Support) Admin/Developer; Hadoop experience would be a plus
Santa Clara, CA
Position Type: Permanent
Position: 2
Urgent Hadoop opening
Santa Clara, CA
Hands-on expertise in Hive.
Skill level: Expert
Should have experience modeling data from Teradata schemas into Hadoop
Use Hive for data management and integration from the staging cluster to semantic analysis
Must be able to provide thought leadership
Previous consulting experience and a client-facing role are a big plus
Position: 3
Hadoop Admin
Location of Requirement: Bay Area, CA
Required Skill Set: Hadoop, HBase, Big Data, NoSQL, Cassandra, Pig, Hive, Linux system administration
Desired Experience Range: 6-9 years
Responsibilities / Expectations of the Role
Ideally someone with 2+ years' experience in large-scale Hadoop cluster management
The ability to balance and tune cluster nodes for performance (our clusters run Hive and HBase, so familiarity with these is a real plus)
Excellent Linux administration skills and understanding of system performance areas (memory, swap, disk I/O, network I/O, etc.)
Demonstrated experience tuning large application systems. Must know how to benchmark and stress test under the highest simulated traffic conditions
BS in Computer Science, Math, or equivalent with a good academic record would be a plus.
Desired Candidate Profile
Must-Have: Map Reduce, Hadoop, HBase, Big Data, NoSQL, Cassandra, Pig, Hive, Linux admin
Position: 4
Big Data Engineer (Hadoop Engineer)
Location: Anywhere in the USA, as long as candidates are willing to relocate to Santa Clara, CA; Edison, NJ; Dallas, TX; or Cincinnati, OH.
However, exceptional candidates may be allowed to work from home anywhere in the USA as long as they are willing to travel 50% or more.
Travel of 50%+ is required for all candidates. Initially we prefer to hire more candidates in the Bay Area, CA region.
Position Summary
There are multiple Hadoop engineering positions within the COMPANY Big Data Department, depending on education and experience. The ideal candidate will possess technical skills in software design and development as well as customer-facing skills at all levels of a partner organization. A background in OO programming and Java is required, as is prior development experience in Hadoop.
Desired Skills & Experience
3+ years of hands-on experience in core Java programming
Solid grasp of Object-Oriented Programming
Preferred: Experience in high scale or distributed RDBMS
Proficiency with Linux/Unix and open source tools/platforms
Experience building applications using Hadoop, Hive, MapReduce, and other components of the Hadoop technology stack
Demonstrated ability to meet development milestones on time, within budget, and with high quality. Prior experience with agile development methodologies (Scrum, etc.) is preferred.
Excellent analytical, written and verbal communication skills
Immediate Availability
Technical Responsibilities:
Participate in requirements analysis and design of Big Data solutions
Develop and test Big Data solutions
Collaborate with Analytics teams to convert prototypes and concepts into efficient and scalable solutions
Adhere to all COMPANY Sales, Human Resource, and corporate ethical policies, standards and guidelines
Excellent written and verbal communication skills
Position: 5
Big Data Architect
Location: Anywhere in the USA, as long as candidates are willing to relocate to Santa Clara, CA; Edison, NJ; Dallas, TX; or Cincinnati, OH.
Travel is required at 25% to 50%+.
Position Summary
The Big Data Department is looking for a Software Architect with solid experience in Hadoop technologies. The ideal candidate will be responsible for the architecture, design, and development of data-centric solutions for our clients, and will work closely with various stakeholders in business and IT to envision and implement modern, high-quality data platform architectures. The candidate should also have the ability and willingness to engage in a hands-on manner when required.
Desired Skills & Experience
Innovative, energetic individual comfortable working in a fast-paced environment
Minimum of 7 years of relevant experience with at least three years as a technical lead or architect
Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Flume)
Expertise in Java/JEE
Experience in high scale or distributed RDBMS
Proficiency with Linux/Unix and open source tools/platforms
Experience in Infrastructure and storage design
Demonstrated experience in the design and development of data architectures
Good understanding of Big Data products in the market
Experience delivering effectively in a fast-paced environment with an ability to adapt quickly
Excellent written and verbal communication skills
Technical Responsibilities:
Design and build robust Hadoop solutions for Big Data problems
Guide the full lifecycle for the Big Data solution including requirements analysis, technical architecture design, solution design, solution development, testing & deployment
Define and document architectural standards
Provide subject matter expertise and stay abreast of latest technologies to solve problems associated with business intelligence and analytics
Provide technical leadership & coaching to junior developers
Adhere to all COMPANY Sales, Human Resource, and corporate ethical policies, standards and guidelines.
Demonstrate strong personal communication skills
Must Have Technical Qualifications:
Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Flume)
Expertise in Java/JEE
Experience in high scale or distributed RDBMS
Copyright ©1990 - 2013 Dice. All rights reserved. Use of this site is subject to certain Terms and Conditions.