Senior Data Engineer with Deep Ab Initio and Hadoop Experience - Atlanta, GA (Onsite)

Overview

Work Setting: On Site
Compensation: Depends on Experience
Employment Type: Contract - W2 or Contract - Independent
Contract Length: 12 month(s)
Travel: No Travel Required

Skills

Big Data
Java
Python
Data Engineering
Hadoop
Data Quality
Data Lake
Metadata
Ab Initio

Job Details

Role Name: Senior Data Engineer with Deep Ab Initio and Hadoop experience
Location: Atlanta, GA (Onsite)
Duration: 12 Months

 

JOB DESCRIPTION:
We are looking for a Senior Data Engineer to join our scrum teams and perform functional and system development of Hadoop applications for our Enterprise Data Lake initiative.

This is a high-visibility, fast-paced, key initiative that will integrate data from internal and external sources, provide analytical insights, and connect with our critical systems.

Essential Responsibilities:
• Participate in the agile development process
• Develop functional and technical specifications from business requirements for the commercial platform
• Ensure application quality and adherence to performance requirements
• Help create project estimates and plans, and represent the engineering team in project meetings and solution discussions
• Participate in the code review process
• Work with team members to achieve business results in a fast-paced and quickly changing environment
• Pair up with data engineers to develop cutting-edge analytic applications, leveraging Big Data technologies such as Hadoop, NoSQL, and in-memory data grids
• Mentor and influence up and down the chain of command
• Perform other duties and/or projects as assigned

Qualifications / Requirements:
• Bachelor’s degree in a quantitative field, such as Engineering, Computer Science, Statistics, or Econometrics, and a minimum of 10 years of experience
• Minimum of 5 years’ experience working with and developing big data solutions
• Expertise in the following Ab Initio tools: GDE (Graphical Development Environment), Co-Operating System, Control Center, Metadata Hub, Enterprise Meta-Environment, Enterprise Meta-Environment Portal, Acquire-It, Express-It, Conduct-It, Data Quality Environment, and Query-It
• Hands-on experience writing shell scripts and complex SQL queries, running Hadoop commands, and using Git
• Ability to write abstracted, reusable code components
• Programming experience in at least two of the following languages: Scala, Java, or Python

Desired Characteristics:
• Strong business acumen
• Critical thinking and creativity
• Performance tuning experience
• Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop
• Familiarity with Ab Initio, Hortonworks, ZooKeeper, and Oozie is a plus
• Willingness to learn new technologies quickly
• Superior oral and written communication skills, and a willingness to collaborate with internal and external technical staff, business analysts, software support, and operations teams
• Broad understanding of Synchrony Financial business processes and practices

Skill: Ab Initio

 

Best Regards,

Chetna

Truth Lies in Heart
