Java/BigData Engineer (AWS, EMR, Event Driven Architecture)

DISYS - Digital Intelligence Systems, LLC
Scala, Machine learning, Microservices, RESTful, Microsoft Windows Azure, Python, Elasticsearch, Google Cloud, NoSQL, Snowflake schema, Docker, ECS, ELT, EMR, ETL, Golang, DaaS, Data analysis, Data architecture, Apache Kafka, Apache Maven, Apache Spark, Apache Velocity, Artificial intelligence, Big data, CAN, Cloud, Code coverage, Amazon Web Services, Ansible, Apache Cassandra, Apache Flink, Data processing, Amazon DynamoDB, Amazon Kinesis, Amazon RDS, Amazon SQS, Analytics, DevOps, Graph databases, Java, Jenkins, Kanban, Streaming
Contract W2, Contract Independent, Contract Corp-To-Corp, 18 Months
$0 - $0
Work from home available. Travel not required.

Job Description

MANAGER NOTES:

Looking for senior developers who will be able to jump right in; not looking for junior or mid-level. They are not looking for candidates whose experience is mostly in lead roles; they need strong hands-on developers who can code.

MUST HAVE: They have had roles like this in the past and have not had great luck; mostly they found that candidates who were strong with Java struggled with the hands-on AWS component, so candidates need to be strong with both Java and AWS. Candidates also have not been able to speak to their experience, so they want to ensure that candidates can truly speak to what they have done. They have had good luck with candidates from Capital One, since that company went through the digital transformation that PI is currently going through; target candidates from Capital One, as they will receive preference.

Required and Preferred skills outlined in more detail below:

Candidates who meet at least 70% of the mandatory skills outlined below are preferred to proceed with the interview.

Software Engineering (object oriented programming)

Java – mandatory

Complete CI/CD (understands automation, code coverage, end to end) – mandatory

Python, Scala, Golang – desired

Event driven & Stream processing

AWS SNS, AWS SQS, AWS Kinesis, AWS Kinesis Analytics - mandatory

Kafka, Flink, Spark Streaming - desired
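
For illustration only, a minimal sketch of the kind of hands-on stream-processing work this group expects, assuming the AWS SDK for Java v2 and a hypothetical Kinesis stream named customer-events; a production consumer would typically use the Kinesis Client Library instead:

    import java.util.List;

    import software.amazon.awssdk.services.kinesis.KinesisClient;
    import software.amazon.awssdk.services.kinesis.model.GetRecordsRequest;
    import software.amazon.awssdk.services.kinesis.model.GetRecordsResponse;
    import software.amazon.awssdk.services.kinesis.model.GetShardIteratorRequest;
    import software.amazon.awssdk.services.kinesis.model.Record;
    import software.amazon.awssdk.services.kinesis.model.ShardIteratorType;

    public class CustomerEventPoller {
        public static void main(String[] args) {
            // Region and credentials come from the default provider chain.
            KinesisClient kinesis = KinesisClient.create();

            // Hypothetical stream and shard names, for illustration only.
            String iterator = kinesis.getShardIterator(GetShardIteratorRequest.builder()
                    .streamName("customer-events")
                    .shardId("shardId-000000000000")
                    .shardIteratorType(ShardIteratorType.TRIM_HORIZON)
                    .build()).shardIterator();

            // Poll one batch of records and print the payloads; a real consumer
            // would loop on nextShardIterator() and checkpoint its progress.
            GetRecordsResponse batch = kinesis.getRecords(GetRecordsRequest.builder()
                    .shardIterator(iterator)
                    .limit(100)
                    .build());
            List<Record> records = batch.records();
            for (Record record : records) {
                System.out.println(record.data().asUtf8String());
            }
            kinesis.close();
        }
    }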

Databases & APIs

DynamoDB or Cassandra - mandatory

AWS RDS – mandatory

REST APIs and AWS ECS - mandatory

Snowflake - desired
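
Again purely illustrative, a minimal sketch of DynamoDB access with the AWS SDK for Java v2; the table name and attribute names here are hypothetical:

    import java.util.Map;

    import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
    import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
    import software.amazon.awssdk.services.dynamodb.model.GetItemRequest;
    import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

    public class CustomerProfileStore {
        public static void main(String[] args) {
            DynamoDbClient dynamo = DynamoDbClient.create();

            // Write a profile item keyed by a hypothetical customerId attribute.
            dynamo.putItem(PutItemRequest.builder()
                    .tableName("CustomerProfile")
                    .item(Map.of(
                            "customerId", AttributeValue.builder().s("c-123").build(),
                            "segment", AttributeValue.builder().s("active-trader").build()))
                    .build());

            // Read the same item back by its primary key.
            Map<String, AttributeValue> item = dynamo.getItem(GetItemRequest.builder()
                    .tableName("CustomerProfile")
                    .key(Map.of("customerId", AttributeValue.builder().s("c-123").build()))
                    .build()).item();
            System.out.println(item.get("segment").s());
            dynamo.close();
        }
    }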

BigData & AI/ML Model deployment

AWS EMR, AWS Glue - mandatory

AWS SageMaker – desired
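
As a rough illustration of the batch side, a minimal Spark job in Java of the kind that could run on EMR; the S3 paths and column names are hypothetical:

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class CustomerEventAggregation {
        public static void main(String[] args) {
            // On EMR, the master URL and AWS credentials come from the cluster configuration.
            SparkSession spark = SparkSession.builder()
                    .appName("customer-event-aggregation")
                    .getOrCreate();

            // Hypothetical input: batch exports of customer events in Parquet on S3.
            Dataset<Row> events = spark.read().parquet("s3://example-bucket/customer-events/");

            // Count events per customer and write the result back to S3 for downstream models.
            events.groupBy("customerId")
                    .count()
                    .write()
                    .mode("overwrite")
                    .parquet("s3://example-bucket/customer-event-counts/");

            spark.stop();
        }
    }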

Interview Process: one 30-minute round with the manager; if that goes well, the candidate will move forward to an hour-long video interview with 2 tech leads.

Overview: The individual on this team will build applications (object-oriented programming) to hyper-personalize the customer experience (databases & APIs) by leveraging near real-time (event-driven & stream processing) and batch data, along with AI/ML models that use reinforcement learning (Big Data & AI/ML deployment). This is for the Personal Investing group.

 

JOB DESCRIPTION

We are currently sourcing a Software Engineer with strong AWS experience to work at CLIENT in Durham, NC!

Role Statement

  • If you are an experienced software engineer with a passion for crafting and delivering big data solutions using groundbreaking technologies, want to be part of an exciting data simplification journey, are looking for a collaborative team environment where you will have a wealth of opportunities to innovate, and have the intellectual curiosity to learn, a career in Customer Data Technologies in PI may be right for you!
  • The Customer Data Technology group within CLIENT’s Personal Investing (PI) organization is seeking a Principal Big Data Engineer to build and maintain large-scale data processing systems. In this role, you will apply a variety of technologies to develop innovative big data solutions. This position is a critical element in delivering CLIENT’s promise of creating the best customer experiences in financial services.

The Expertise we’re looking for

  • Bachelor’s or Master’s Degree in a technology related field (e.g. Engineering, Computer Science, etc.) required.
  • 5+ years of experience implementing big data solutions in the data analytics space
  • 2+ years of experience developing big data applications in the cloud (AWS, Azure, Google Cloud)
  • Extensive experience in object-oriented programming (Java, Scala, Python), messaging technologies (Kafka, Kinesis, SNS, SQS), relational and NoSQL databases (DynamoDB, Elasticsearch, graph databases), stream processing (Flink, Kinesis Analytics, Spark), data movement technologies (ETL/ELT), REST APIs, and in-memory technologies
  • Strong knowledge of developing highly scalable distributed systems using AWS services and open source technologies
  • Experience deploying machine learning models with reinforcement learning in highly scalable environments
  • Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker)
  • Solid experience in Agile methodologies (Kanban and Scrum)

The Purpose of your role: The Principal Software Engineer in PI Customer Data Technologies is responsible for the design and development of highly available, scalable, distributed data platforms that use open source frameworks to process a high volume, high velocity, and wide variety of structured and unstructured data. The Principal Software Engineer will also be a technical lead for the team, with responsibility for solution design, solving key technical challenges, and mentoring the team.

The Skills you bring

  • Strong technical design and analysis skills
  • Ability to deal with ambiguity and work in a fast-paced environment
  • Deep experience supporting mission-critical applications and resolving issues quickly
  • Excellent communication skills, both through written and verbal channels
  • Excellent collaboration skills to work with multiple teams in the organization
  • Ability to understand and adapt to changing business priorities and technology advancements
  • Strong knowledge of technology trends in implementing the big data ecosystem
  • Strong teammate and able to mentor junior team members
  • Solid understanding of data architecture patterns such as Lambda, Kappa, Event-Driven Architecture, Data as a Service, Microservices, etc.
  • Strategic thinking and critical problem solving skills

The Value you deliver

  • Designing, building, and supporting mission-critical applications to provide the best customer experience
  • Exploring new technology trends and using them to simplify our data ecosystem
  • Driving innovation and leading the team to implement forward-looking solutions
  • Collaborating with internal and external teams to deliver technology solutions for the business needs
  • Guiding teams to improve development agility and efficiency
  • Resolving technical roadblocks for the team and mitigating potential risks
  • Delivering system automation by setting up continuous integration/continuous delivery pipelines
  • Acting as a technical mentor to the team and bringing them up to speed on latest data technologies and promoting continuous learning

How your work impacts the organization: Customer Data Technology in PI supports the platforms that enable business users to collect and analyze the customer data needed to provide the best customer experience.



Company Information

Dice Id : 10110693
Position Id : 332013
Originally Posted : 2 months ago
