Overview
Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required
Unable to Provide Sponsorship
Skills
MarTech
Scala
Java
Python
Hive
Spark
Kafka
HBase on Hadoop
Hadoop
NoSQL
Job Details
Role: Ab Initio Engineer (with Data Lake MarTech)
Location: Atlanta, GA (Hybrid Onsite)
Duration: 12+ Months Contract
F2F Interview Highly Preferred for Local Candidates.
Note: Candidate needs to be in the office 3-4 days every week. Local candidates or candidates from adjacent states only.
Summary/Purpose:
- We are looking for a Senior Data Engineer to join our scrum teams and perform functional and system development for Hadoop applications supporting our Enterprise Data Lake initiative.
- This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with our critical systems.
Essential Responsibilities:
- Participate in the agile development process.
- Develop functional and technical specifications from business requirements for the commercial platform.
- Ensure application quality and adherence to performance requirements.
- Help create project estimates and plans. Represent engineering team in project meetings and solution discussions.
- Participate in the code review process.
- Work with team members to achieve business results in a fast-paced and quickly changing environment.
- Pair up with data engineers to develop cutting-edge analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and in-memory data grids.
- Mentor and influence up and down the chain of command.
- Perform other duties and/or projects as assigned.
Qualifications/Requirements:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics) and a minimum of 10 years of experience
- Minimum 5 years’ experience working with and developing big data solutions
- Expertise in the following Ab Initio tools: GDE (Graphical Development Environment); Co>Operating System; Control Center; Metadata Hub; Enterprise Meta Environment; Enterprise Meta Environment Portal; Acquire>It; Express>It; Conduct>It; Data Quality Environment; Query>It.
- Hands-on experience writing shell scripts, complex SQL queries, and Hadoop commands, and working with Git.
- Ability to write abstracted, reusable code components (an illustrative sketch follows this list).
- Programming experience in at least two of the following languages: Scala, Java, or Python.
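For illustration only (not a requirement taken from this posting): a minimal Scala/Spark sketch of the kind of abstracted, reusable code component described above. The dataset paths, column names, and job name are hypothetical assumptions; Spark 3.x is assumed on the classpath.

// Hypothetical sketch only; paths and column names are assumptions.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object Transforms {
  // A reusable transformation written as DataFrame => DataFrame, so it can be
  // chained via DataFrame.transform and unit-tested with a local SparkSession.
  // Keeps only the most recent record per business key.
  def latestByKey(keyCols: Seq[String], tsCol: String)(df: DataFrame): DataFrame = {
    val w = Window.partitionBy(keyCols.map(col): _*).orderBy(col(tsCol).desc)
    df.withColumn("_rn", row_number().over(w))
      .filter(col("_rn") === 1)
      .drop("_rn")
  }
}

object ExampleJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("dedupe-example").getOrCreate()
    spark.read.parquet("/data/lake/raw/customers")   // hypothetical lake path
      .transform(Transforms.latestByKey(Seq("customer_id"), "updated_at"))
      .write.mode("overwrite").parquet("/data/lake/curated/customers")
    spark.stop()
  }
}

Expressing transformations as plain functions, rather than inlining them in a job, is one common way to keep components abstracted, composable, and testable in isolation.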
Desired Characteristics:
- Critical thinking and creativity.
- Performance tuning experience.
- Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop (a brief sketch follows this list).
- Familiarity with Ab Initio, Hortonworks, ZooKeeper, and Oozie is a plus.
- Willingness to learn new technologies quickly.
- Superior oral and written communication skills, as well as the willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff.
- Strong business acumen, including a broad understanding of Synchrony Financial business processes and practices.
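Purely as an illustrative sketch of the Spark/Kafka-on-Hadoop development mentioned above: a minimal Structured Streaming job that consumes a Kafka topic and lands records in the data lake. The broker address, topic name, and paths are assumptions, not details from this posting, and the spark-sql-kafka connector must be on the classpath.

// Hypothetical sketch; broker, topic, and paths are assumptions.
import org.apache.spark.sql.SparkSession

object KafkaToLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("kafka-to-lake").getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // assumed broker
      .option("subscribe", "martech.events")             // assumed topic
      .load()
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/lake/raw/martech_events")          // assumed landing path
      .option("checkpointLocation", "/data/lake/_chk/martech")  // required for fault tolerance
      .start()

    query.awaitTermination()
  }
}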