Cloudera Hadoop Big Data Architect/Administrator/Developer

Amazon EC2, Amazon Web Services, Java, AWS, AngularJS, Architecture, Communication skills, Continuous integration, Distributed computing, IaaS, J2EE, JavaScript, Leadership, Planning, Presentations, Problem solving, Project management, Requirements analysis, Risk assessment, SOA, SOLID, SaaS, Software, Software development, Solution architecture, Spring, TypeScript, Artificial intelligence, Federal government, Machine learning, Software design, Data analysis, Business process, Automation, Deep learning, IT, Cloudera
Contract W2, Contract Independent, Contract Corp-To-Corp
Depends on Experience

Job Description

Role: Cloudera Hadoop/Big Data Consultant

Location: Alexandria, VA.

Duration: Long term

Role & Responsibilities:

Design and implement Big Data analytic solutions on a Hadoop-based platform. Create custom analytic and data mining algorithms to help extract knowledge and meaning from vast stores of data. Refine a data processing pipeline focused on unstructured and semi-structured data. Support both quick-turn, rapid implementations and larger-scale, longer-duration analytic capability implementations.

  • Hadoop development and implementation.
  • Loading from disparate data sets.
  • Pre-processing using Hive and Pig.
  • Designing, building, installing, configuring, and supporting Hadoop.
  • Translating complex functional and technical requirements into detailed designs.
  • Performing analysis of vast data stores to uncover insights.
  • Maintaining security and data privacy.
  • Creating scalable, high-performance web services for data tracking.
  • High-speed querying.
  • Managing and deploying HBase.
  • Proposing best practices and standards.


Required Skills:

  • Experience with Hadoop and the HDFS ecosystem.
  • Strong experience with Apache Spark, Storm, and Kafka is a must.
  • Experience with Python, R, Pig, Hive, Kafka, Knox, Tomcat, and Ambari.
  • Experience with MongoDB.
  • A minimum of 4 years working with HBase/Hive/MRv1/MRv2 is required.
  • Experience integrating heterogeneous applications is required.
  • Experience working with the Systems Operation Department to resolve a variety of infrastructure issues.
  • Experience with Core Java, Scala, Python, and R.
  • Experience with relational database systems (SQL) and hierarchical data management.
  • Experience with MapReduce.
  • Experience with ETL tools such as Sqoop and Pig.
  • Data modeling and implementation.
  • Machine learning or AI experience is a big plus, especially with Python/TensorFlow.
Dice Id: 10491343
Position Id: 6661098
Originally Posted: 5 months ago
