Top 3 requirements:
- 6 years of experience with Hadoop
- 2 years of experience architecting and designing data projects in large enterprise environments (preferably in a "Principal" or other very senior-level role)
- Expertise in Java/Scala/Python, SQL, NoSQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark Streaming, MLlib), Kafka, or equivalent cloud big data components
Plus: Retail / eComm environment experience
- Working with their Enterprise Hadoop Platform
- Currently migrating from an older Hadoop version up to the newer CDP (Cloudera Data Platform) release; they plan to move to the Apache (more open-source) distribution of Hadoop in the future
- Taking the entire platform from a managed cloud data platform to open source, which requires extensive planning
- Good understanding of deployments, engineering, DevOps, data centers, etc.
- Will drive this entire initiative at multiple levels: developing that plan and leading teams to execute on it
- "What does the enterprise-wide data look like 2-3 years from now?"
Apex Systems is an equal opportunity employer. We do not discriminate or allow discrimination on the basis of race, color, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related medical conditions), age, sexual orientation, gender identity, national origin, ancestry, citizenship, genetic information, registered domestic partner status, marital status, disability, status as a crime victim, protected veteran status, political affiliation, union membership, or any other characteristic protected by law. Apex will consider qualified applicants with criminal histories in a manner consistent with the requirements of applicable law. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation in using our website for a search or application, please contact our Employee Services Department at