"U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time."
- Implement end-to-end Hadoop solutions with a deep understanding of the Hadoop ecosystem
- Integrate technical requirements (e.g., scalability, security, performance, data recovery, reliability) into solutions
- Research, evaluate, architect, and deploy new tools, frameworks, and patterns to build sustainable Big Data platforms
- Design and implement complex, highly scalable statistical models and solutions that comply with security requirements
- Expert knowledge of the Hadoop ecosystem, including HDFS, Hive, YARN, Flume, Oozie, Kafka, Storm, Spark, Scala, and Spark Streaming, as well as NoSQL databases
- Good understanding of Hadoop administration