Hadoop Developer HDF (Hortonworks DataFlow)

See job description
Full Time
Telecommuting not available | Travel not required

Job Description

Karsun Solutions - We are Enterprise Modernization Experts!

Our portfolio of long-term contracts helps transform business enterprises.

Our Innovation Center provides freedom to explore and experiment.

Our culture is based on true work-life balance and opportunities to learn.

Karsun is a premier consulting services company with a reputation for innovation. We provide Enterprise Modernization services to both civilian and defense Federal agencies. Our company is recognized as one of the fastest-growing private companies in the U.S., as well as one of the most promising Government Consulting Solutions Providers. Karsun is an ISO 9001:2008 certified organization.

Technical Responsibilities:

• Work with master data management (MDM) and data integration technologies to design and implement new solutions and processes for clients to support required data governance and management strategies
• Work with business and technology client representatives to gather functional and technical requirements
• Analyze requirements and provide leading practices in the design of the solution
• Create data models
• Install and Configure MDM tools to meet business needs
• Participate in client-facing meetings and document key decisions and action items
• Serve as a subject matter expert (SME) on Master Data Management and data governance
• Keep informed of the latest technology trends and innovations, especially in the areas of data integration, master data management, data management platforms, digital asset management, and web content management

Functional Responsibilities:
• Create and maintain MDM data models and process documentation as required
• Create and maintain best-practice guides for end users, information stewards, and platform administrators

NOTE: This is not a remote position. This position will be located onsite in Herndon, VA during core business hours (9am-5pm).

Required Skills:
• Bachelor's degree in Computer Science or related discipline

• US Citizenship or Green Card Holder required.
• Must be able to obtain a Public Trust (Moderate level) security clearance
• 5+ years of experience ingesting data into Hadoop from a variety of sources, such as ERP, CRM, NoSQL, and transactional data systems
• 3+ years of experience building Hadoop-based ETL workflows
• Experience monitoring performance and implementing any infrastructure changes necessary to meet performance goals
• 2+ years of hands-on experience with the Hortonworks Data Platform: Hadoop v2, Apache NiFi/Hortonworks DataFlow, the Spark ecosystem, and MapReduce
• 5+ years of experience building and operationalizing Hadoop-based data lakes
• 5+ years of experience with Big Data querying tools, such as Pig, Hive, and Drill
• 2+ years of experience building stream-processing systems using solutions such as Storm, Spark Streaming, or HDF
• Experience implementing Big Data ML toolkits, such as Amazon ML, SparkML, or H2O
• Experience using and integrating NoSQL databases, such as HBase, Cassandra, and MongoDB
• Knowledge of various ETL tools for Hadoop, such as Pentaho Data Integration and SAP Data Services
• Experience with various messaging systems, such as Kafka
• Experience with code/build/deployment tools such as Git, SVN, Maven, sbt, and Jenkins
• Ability to work through ambiguity and maintain focus on delivery with minimal supervision
• Ability to deliver under challenging deadlines and constraints

Desired Skills:
• Excellent communication skills with experience in business requirements definition and creating clear documentation
• Experience with streaming analytics and data integration is highly desirable
• Experience with ETL tools such as SAP Data Services, Information Steward, Talend, or Pentaho PDI is desirable
• Basic demonstrable Linux skills, including the ability to write simple shell scripts
• Python scripting skills are highly advantageous
• Self-starter with strong motivation and the ability to learn new tools and build expertise
• Experience with the Hadoop ecosystem and related analytics tools such as Spark, Zeppelin, and Hue is highly desirable
• Experience with data management tooling: ETL, MapReduce, HDF, StreamSets
• Strong methodical approach to problem solving; very strong hands-on coding skills are expected
• Extensive knowledge of the SDLC with working experience in Agile methodologies

Karsun Solutions is an Equal Employment Opportunity (EEO) employer. It is the policy of the Company to provide equal employment opportunities to all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran or disabled status, or genetic information.

Dice Id : RTX15a3f1
Position Id : 97955990