Skills
- Computer engineering
- Collaboration
- Data
- Jenkins
- Continuous integration and delivery
- Insurance
- MapReduce
- Leadership
- HDFS
- Streaming
Job Description
HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts. You'll join successful teams contributing to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.
At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid-Time-Off, Paid Holidays, 401K matching, Life and Accidental Death Insurance, Short & Long Term Disability Insurance, and a variety of other perks.
Position Description:
- Utilize Hadoop Ecosystem to land, transform, and store data to make available for analytics.
- Make use of real-time streaming architecture to move and process data.
- Meet with business customers to design and suggest solutions that fulfill critical business needs.
- Implement solutions that meet IT standards, procedures, and security requirements with high quality.
- Act as a full stack developer by working with many disparate and diverse technologies.
- Actively participate in all agile ceremonies, such as backlog refinement, standup, iteration closure, and iteration retrospective.
- Review ongoing production software operations and troubleshoot production issues.
- Utilize technical knowledge and connected vehicle architecture to suggest, design, and implement optimal Big Data solutions.
- Utilize Continuous Integration / Continuous Delivery and Test-Driven Development to deliver quality software.
- Lead architecture and design discussions to devise optimal solutions.
- Guide and coach other software engineers on best practices.
Skills Required:
- Firm understanding of the following big data technologies: MapReduce, Oozie, Hive, HBase, HDFS, Spark, Storm, Kafka, and NiFi.
- Additional technical experience required: PCF, Spring Boot, Java, and Linux.
- Experience with CI/CD systems such as Jenkins.
- Understanding of various big data batch and streaming architectures and design.
- Ability to utilize real-time streaming architecture to interact with and land streaming data sources.
- Experience with GitHub and AccuRev SCM systems.
- Experience with Agile practices.
- Self-starter and good communicator.
- Knowledge of analytics customer use cases.
Experience Required:
- Minimum of 5 years' experience in the following big data technologies: MapReduce, Oozie, Hive, HBase, HDFS, Spark, Storm, Kafka, and NiFi.
Education Required:
- Required: Bachelor's Degree in Computer Science, Computer Engineering, or another technical discipline.
- Preferred: Master's Degree in Computer Science, Computer Engineering, or another technical discipline.