Job Title: Senior Ab Initio Developer with Hadoop
Job Type: Full Time
Job Location: Multiple Locations
Duties & Responsibilities:
We are looking for a Senior Data Engineer to join our scrum teams and perform functional and system development for Hadoop applications supporting our Enterprise Data Lake initiative.
Role Summary/Purpose:
This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with our critical systems.
- Participate in the agile development process
- Develop functional and technical specifications from business requirements for the commercial platform
- Ensure application quality and adherence to performance requirements
- Help create project estimates and plans; represent the engineering team in project meetings and solution discussions
- Participate in the code review process
- Work with team members to achieve business results in a fast paced and quickly changing environment
- Pair up with data engineers to develop cutting-edge analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids
- Mentor and influence up and down the chain of command
- Perform other duties and/or projects as assigned
Qualifications/Requirements:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, or Econometrics) and a minimum of 10 years of experience
- Minimum of 5 years' experience working with and developing big data solutions
- Experience with big data tools: Hadoop, Spark, Hive, Sqoop, HBASE, Kafka, etc.
- Expertise in the following Ab Initio tools: GDE (Graphical Development Environment), Co>Operating System, Control Center, Metadata Hub, Enterprise Meta>Environment, Enterprise Meta>Environment Portal, Acquire>It, Express>It, Conduct>It, Data Quality Environment, and Query>It
- Hands-on experience writing shell scripts, complex SQL queries, and Hadoop commands, and working with Git
- Ability to write abstracted, reusable code components
- Programming experience in at least two of the following languages: Scala, Java, or Python
- Strong business acumen
- Critical thinking and creativity
- Performance tuning experience
- Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop
- Familiarity with Ab Initio, Hortonworks, ZooKeeper, and Oozie is a plus
- Willingness to learn new technologies quickly
- Superior oral and written communication skills, as well as a willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff
- Strong business acumen including a broad understanding of Synchrony Financial business processes and practices
Interested candidates can apply via the link below.