Data Engineer Hadoop

Contract W2
Depends on Experience
Work from home: not available. Travel: not required.

Job Description

Our client, a leading global financial services company, has approximately 200 million customer accounts and does business in more than 140 countries. They provide consumers, corporations, governments and institutions with financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management.

Key Activities (listed in order of importance/time spent, highest to lowest)
- Define needs around maintainability, testability, performance, security, quality, and usability for the data platform
- Drive implementation, consistent patterns, reusable components, and coding standards for data engineering processes
- Work with the Business Analysts and customers throughout the requirements process to properly understand the long-term goals of the program and where they fit in the overall UI infrastructure
- Communicate new technologies, best practices, etc. to developers, testers, and managers
- Mentor team members and peer-review designs and coded implementations
- Work with technical specialists (Security Team, Performance Engineer, etc.) to ensure that all parties understand the system that is being designed and built and that all major issues are understood and mitigated.
- Participate in all phases of the product development cycle: design, scoping, planning, implementation, and test
- Serve as an integral member of our AI and Analytics team, responsible for the design and development of Big Data solutions
- Partner with domain experts, product managers, analysts, and data scientists to develop Big Data pipelines in Hadoop or Google Cloud Platform
- Deliver a data-as-a-service framework on Google Cloud Platform
- Move all legacy workloads to the cloud platform
- Work with data scientists to build Client pipelines using heterogeneous sources and provide engineering services for data science applications
- Ensure automation through CI/CD across platforms both in cloud and on-premises
- Research and assess open-source technologies and components to recommend and integrate into the design and implementation
- Be the technical expert and mentor other team members on Big Data and Cloud Tech stacks

Some of the best practices supported include, but are not limited to:
- Achieving 85% code coverage through Test-Driven Development (TDD)
- Leveraging automated testing on 100% of API code
- Leveraging automated testing for Continuous Integration/Continuous Delivery (functional & performance)
- Ensuring frequent code check-ins and supporting peer code reviews

Education level and/or relevant experience(s)
Required:
- BS/BA degree or equivalent combination of education/experience.
- Intermediate to senior level experience in an Apps Development role. Demonstrated strong execution capabilities.
Knowledge and skills (general and technical)
Required:
- 5+ years of experience with Hadoop (Cloudera) or cloud technologies
- Expert level building pipelines using Apache Beam or Spark
- Familiarity with core provider services from AWS, Azure, or GCP, preferably having supported deployments on one or more of these platforms
- Experience with all aspects of DevOps (source control, continuous integration, deployments, etc.)
- Experience with containerization and related technologies (e.g., Docker, Kubernetes)
- Experience with other open-source technologies such as Druid, Elasticsearch, Logstash, etc. is a plus
- Advanced knowledge of the Hadoop ecosystem and Big Data technologies
- Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr)
- Knowledge of Agile (Scrum) development methodology is a plus
- Strong development/automation skills
- Proficient in programming in Java or Python; prior Apache Beam/Spark experience a plus
- System-level understanding: data structures, algorithms, distributed storage & compute
- Can-do attitude toward solving complex business problems; good interpersonal and teamwork skills
- Angular 4 and React development expertise in an up-to-date Java development environment with cloud technologies
- Exposure to and/or development experience in microservices architecture best practices, Java Spring Boot framework (preferred), Docker, Kubernetes
- Experience with REST APIs, services, and API authentication schemes
- Knowledge in RDBMS and NoSQL technologies
- Exposure to multiple programming languages
- Knowledge of modern CI/CD, TDD, Frequent Release Technologies and Processes (Docker, Kubernetes, Jenkins)
- Exposure to mobile programming is a plus
Other Requirements (licenses, certifications, specialized training, physical or mental abilities required)
- Successfully complete assessment tests offered on Pluralsight, Udemy, etc., or complete certifications to demonstrate technical expertise on more than one development platform


Posted By

1185 6th Ave., 3rd Floor, New York, NY 10036

Dice Id : gsc
Position Id : 19-01012
Originally Posted : 2 months ago

Similar Positions

Hadoop Developer
  • Data Incorporated
  • Addison, TX
Hadoop developer
  • MarvelousTek
  • Dallas, TX
Hadoop Developer
  • VGB Technologies PVT LTD
  • Plano, TX
Hadoop Developer
  • Quantum Technologies LLC
  • Dallas, TX
Hadoop Developer
  • United Software Group
  • Addison, TX
Hadoop Administrator
  • Beacon Hill Technologies
  • Irving, TX
Senior Developer - Hadoop
  • Infinity Consulting Solutions
  • Richardson, TX
Jr. Hadoop Developer
  • Purview Infotech
  • Dallas, TX
Hadoop Developer
  • Matlen Silver
  • Plano, TX
Big Data/Spark Engineer
  • Anblicks
  • Dallas, TX
Senior Hadoop Developer - Powerful Salary - Richardson, TX
  • FRG Technology Consulting
  • Richardson, TX
Hadoop DevOps Engineer
  • Allstate Insurance Company
  • Irving, TX
Hadoop Developer (Richardson,TX)
  • MATRIX Resources, Inc.
  • Richardson, TX