Job Details
Ab Initio Developer (AWS)
Seeking an Ab Initio Developer in the Hadoop and AWS ecosystem! The selected candidate will be responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premises infrastructure for big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance. This role will focus on leading the development of solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.

ESSENTIAL FUNCTIONS:
- 20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes), including data access APIs. Prepares and manipulates data using multiple technologies.
- 15% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets using data from multiple systems.
- 15% Develops data models by studying existing data warehouse architecture; evaluating alternative logical data models, including planning and execution tables; applying metadata and modeling standards, guidelines, conventions, and procedures; and planning data classes and sub-classes, indexes, directories, repositories, messages, sharing, replication, back-up, retention, and recovery.
- 15% Creates data collection frameworks for structured and unstructured data.
- 15% Improves data delivery engineering job knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking state-of-the-art practices, and participating in professional societies.
- 10% Applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources (a minimal sketch follows this list).
- 10% Applies and implements best practices for data auditing, scalability, reliability, and application performance.
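To ground the extract/transform/load duties above, here is a minimal batch ETL sketch in PySpark. The S3 path, column names, and target table are hypothetical placeholders for illustration only, not details from this posting, and the actual role uses Ab Initio rather than hand-written Spark jobs.

```python
# Illustrative batch ETL sketch; all paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("claims_batch_etl")   # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read raw records from a (hypothetical) S3 landing zone.
raw = spark.read.json("s3a://example-bucket/landing/claims/")

# Transform: standardize types and drop malformed rows.
cleaned = (
    raw
    .withColumn("claim_date", F.to_date("claim_date", "yyyy-MM-dd"))
    .filter(F.col("claim_id").isNotNull())
)

# Load: append the cleaned set to a partitioned warehouse table.
(
    cleaned.write
    .mode("append")
    .partitionBy("claim_date")
    .saveAsTable("analytics.claims_clean")   # hypothetical target table
)
```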
Required Qualifications:
- BS Degree in Computer Science, Information Technology, Engineering, or a related field is required.
- 5 years of experience with database design and data modeling tools. Experience developing and updating ETL/ELT scripts.
- Hands-on experience with Ab Initio ETL development in the Cloudera, Hadoop, Hive, and AWS ecosystem, including relational database layout, development, and data modeling.
- Hands-on experience as a Big Data Engineer in the Hadoop and AWS ecosystem within the healthcare industry, preferably BCBS.
- Hands-on experience developing applications for batch data loads and data streaming using Cloudera/Hadoop and/or AWS technologies.
- Strong technical, analytical, and problem-solving skills to troubleshoot and resolve a variety of problems.
- Strong organizational and communication skills, both written and verbal, with the ability to handle multiple priorities.
- Healthcare payor industry experience is a big plus.

Preferred Qualifications:
- Knowledge and understanding of at least one programming language (e.g., SQL, NoSQL, Python).
- Knowledge and understanding of database design and implementation concepts.
- Knowledge and understanding of data exchange formats (a small example follows this list).
- Knowledge and understanding of data movement concepts.
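As a small illustration of the data exchange formats and data movement concepts mentioned above, the sketch below converts newline-delimited JSON records to columnar Parquet, a common step when moving data between systems. The file names are hypothetical placeholders.

```python
# Illustrative only: JSON-to-Parquet conversion with pandas; file names are hypothetical.
import pandas as pd

records = pd.read_json("members.json", lines=True)   # newline-delimited JSON input
records.to_parquet("members.parquet", index=False)   # columnar format suited to analytics
```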