Mid-level ETL developer with hands-on Hadoop experience.
Essential Duties and Responsibilities:
The following is a summary of the essential functions of this job. Other duties, both major and minor, may be performed that are not mentioned below, and specific activities may change from time to time.
1. ETL developer with some level of Hadoop experience
2. Sound understanding of and experience with the Hadoop ecosystem (Cloudera). Able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the problems at hand.
3. Experience working with a Big Data implementation in a production environment
4. Experience with HDFS, MapReduce, Hive, Impala, and Linux/Unix technologies is mandatory
5. Experience with Spark is an added advantage
6. Experience in Unix shell scripting is mandatory
7. Able to analyze existing shell script, Python, or Perl code to debug issues or enhance the code
8. Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
9. Strong IT consulting experience across various data warehousing engagements, handling large data volumes and architecting big data environments.
10. Deep understanding of algorithms, data structures, performance optimization techniques, and software development in a team environment.
11. Benchmark and debug critical issues with algorithms and software as they arise.
12. Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments.
13. Able to guide and mentor the development team, for example by creating custom common utilities/libraries that can be reused across multiple big data development efforts.
14. Exposure to ETL tools (e.g., DataStage) and NoSQL databases (HBase, Cassandra, MongoDB) is desirable
15. Work with line of business (LOB) personnel, external vendors, and internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines.
16. Provide technical resources to assist in the design, testing, and implementation of software code and infrastructure to support data infrastructure and governance activities.
17. Support multiple projects with competing deadlines
Desired Skills:
1. Previous experience in the financial services industry
2. Previous experience migrating workloads from legacy SQL systems to Hadoop
3. Broad BofA technical experience, a good understanding of existing testing/operational processes, and an open mind about how to enhance them
4. Understanding of industry trends and relevant application technologies
5. Experience in designing and implementing analytical environments and business intelligence solutions
6. Experience working in an Agile development shop
Shift:
1st shift (United States of America)
Hours Per Week: