Sr Hadoop Engineer

Full Time
Telecommuting not available. Travel not required.

Job Description


At ACI Worldwide, we are ACHIEVERS, COLLABORATORS, INNOVATORS. We come together from across the GLOBE with a singular goal to POWER global commerce with our INNOVATIVE technology solutions.

ACI Worldwide, the Universal Payments company, powers electronic payments and banking for more than 5,100 financial institutions, retailers, billers and processors around the world. ACI software processes $14 trillion in payments and securities transactions for more than 300 of the leading global retailers, and 18 of the world's 20 largest banks. We have a definitive vision of how electronic payment systems will look in the future, and we have the knowledge, scale and resources to deliver it. However, any payment portfolio would be incomplete without fraud prevention and analytics to round it out. ACI has a suite of world-class Analytics and Fraud Prevention products and services built on modern technologies such as containerization and Hadoop. As a Sr. Hadoop/Big Data Engineer, you too can help us drive payments at the speed of change.

Essentially this role will have two phases, namely:
  • The immediate need is for a strong Hadoop presence to help us operationalize the clusters for both the dev and prod environments.

  • Longer term, the job will evolve into an engineering role within the data science organization, acting as a liaison between the Architects, the Data Science team, and the Ops team as we define and deploy models and services for fraud and related analytics.

Essential Duties and Responsibilities:
  • Manages and participates in the day-to-day operational work across the Hadoop clusters.

  • Works closely with hosted ops colleagues to define operational best practices for the UAT and Prod Hadoop clusters.

  • Participates in project planning and review as they pertain to Hadoop and Hadoop clusters.

  • Develops software applications utilizing Hadoop and other big data technologies.

Minimum Requirements:
  • Bachelor's degree in Computer Science or a related field and/or equivalent experience.

  • Previous experience administering Hadoop clusters.

  • 5 years of Java development experience.

  • 5 years of related work experience, including experience with systems testing and system requirements processes (planning, elicitation, analysis, and management).

  • 5 or more years of experience working with Hadoop in a big data environment.

Highly Desired Skills:
  • Experience with the Hortonworks Hadoop Distribution (HDP).

  • Experience with Spark, HBase, Hive, NiFi, and Phoenix.

  • Experience with Cassandra, Solr, and Kafka.

  • Experience with agile software development methodologies (e.g., Scrum).

  • Familiarity with containerization technology (e.g., Docker, Kubernetes).

  • Experience with machine learning using Hadoop or other frameworks.

  • Experience working with Kerberized Hadoop clusters.

  • Knowledge of Falcon, Atlas, Sqoop, Flume, Pig, Storm, Zeppelin, Jupyter, Ambari, Ranger, Knox, ZooKeeper, and Oozie.

In return for your expertise, we offer challenge, opportunity, and an excellent compensation and benefits package in a casual environment. Are you ready to help us transform the world of electronic payments?

Position will be posted for 5 business days or until the position is filled.

ACI Worldwide is an AA/EEO employer, which includes providing equal opportunity for protected veterans and individuals with disabilities.

Dice Id : RTX15a7e3
Position Id : 18000475