Detailed job description:
Work with business teams, end users, and architects to gather business requirements.
Lay out the technical design and solution in Ab Initio based on business needs.
Design, develop, test, and migrate Ab Initio graphs capable of processing large data volumes with complex business rules and aggregations.
Develop Ab Initio graphs that perform parallel data processing using data parallelism (MFS, the multifile system), component parallelism, and pipeline parallelism.
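Ab Initio graphs are built graphically rather than in text, so the data-parallel pattern above can only be sketched by analogy. Below is a minimal Python sketch of data parallelism: the same transform runs concurrently over partitions of the data, much as graph components run against the partitions of an MFS. All names and the toy business rule are illustrative, not part of any actual graph.

```python
from multiprocessing import Pool

def transform(record):
    # Toy per-record business rule (illustrative only):
    # normalize the key to uppercase and double the value.
    key, value = record
    return key.upper(), value * 2

def process_partitioned(records, partitions=4):
    # Data parallelism: the pool splits the records across worker
    # processes, each applying the same transform, analogous to one
    # graph component fanned out over an MFS's partitions.
    with Pool(partitions) as pool:
        return pool.map(transform, records)

if __name__ == "__main__":
    data = [("a", 1), ("b", 2), ("c", 3), ("d", 4)]
    print(process_partitioned(data))
```

Component and pipeline parallelism have no direct analogue here; in a graph they come from independent components running at once and from downstream components consuming records while upstream ones are still producing.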
Implement complex data chaining techniques in Ab Initio to maintain historical data in data warehouses for use in exploratory analysis.
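The history-maintenance idea can be illustrated outside Ab Initio. The sketch below chains history records in the spirit of a type-2 slowly changing dimension: when a key's value changes, the open record is closed out and a new version is appended. Field names and the schema are assumptions for illustration, not the actual warehouse design.

```python
from datetime import date

def chain_history(history, incoming, today):
    """Chain history records, SCD type-2 style (illustrative schema).

    history  -- list of dicts with key, value, valid_from, valid_to
    incoming -- dict mapping key -> current value from the source feed
    """
    out = []
    seen = set()
    for row in history:
        key = row["key"]
        if (row["valid_to"] is None and key in incoming
                and incoming[key] != row["value"]):
            # Value changed: close the open record, chain a new version.
            out.append({**row, "valid_to": today})
            out.append({"key": key, "value": incoming[key],
                        "valid_from": today, "valid_to": None})
        else:
            out.append(row)
        seen.add(key)
    for key, value in incoming.items():
        if key not in seen:
            # New key: open its first history record.
            out.append({"key": key, "value": value,
                        "valid_from": today, "valid_to": None})
    return out
```

Keeping every closed version, rather than overwriting in place, is what makes the warehouse usable for exploratory analysis over time.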
Implement Ab Initio graphs that ingest and cleanse data, apply complex business rules, and perform multi-level aggregations on source data to feed business intelligence reports and dashboards.
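A multi-level aggregation of the kind described above can be sketched in a few lines of Python: detail totals at one grain and a rollup at a coarser grain, as a report might need. Column names are illustrative, not the actual report schema.

```python
from collections import defaultdict

def multi_level_totals(rows):
    # Aggregate amounts at two levels:
    # (region, product) detail and a region-level rollup.
    by_region_product = defaultdict(float)
    by_region = defaultdict(float)
    for region, product, amount in rows:
        by_region_product[(region, product)] += amount
        by_region[region] += amount
    return dict(by_region_product), dict(by_region)
```

In a graph this would typically be a ROLLUP component per level, fed by the cleansed, rule-applied stream.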
Develop high-performance Ab Initio graphs to process semi-structured data such as JSON and XML.
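To make the semi-structured-data step concrete, here is a small Python sketch that flattens equivalent JSON and XML payloads into the same flat record, the kind of normalization such a graph performs before downstream rules run. The payload shape and field names are invented for illustration.

```python
import json
import xml.etree.ElementTree as ET

def parse_json_record(text):
    # Flatten one JSON order record into a flat dict (illustrative shape).
    obj = json.loads(text)
    return {"id": obj["id"],
            "total": sum(item["price"] for item in obj["items"])}

def parse_xml_record(text):
    # Same flattening for an equivalent XML payload.
    root = ET.fromstring(text)
    return {"id": root.get("id"),
            "total": sum(float(item.get("price"))
                         for item in root.findall("item"))}
```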
Develop data ingestion graphs that ingest data from Kafka, AWS S3, and Hadoop in near real time, with balance-and-control checks and data reconciliation to ensure data accuracy with no data loss.
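The balance-and-control idea is independent of the ingestion source, so it can be sketched generically: compute control totals (a record count and an amount total) on both sides of the load and accept the batch only if they match. Field names and the acceptance rule are assumptions for illustration.

```python
def control_totals(records, amount_field="amount"):
    # Control totals for a batch: record count plus an amount total,
    # rounded to avoid float noise in the comparison.
    count = len(records)
    total = round(sum(r[amount_field] for r in records), 2)
    return count, total

def reconcile(source, target):
    # Balance-and-control check: the load is accepted only when the
    # target's totals match what was read from the source, flagging
    # both dropped and duplicated records.
    return control_totals(source) == control_totals(target)
```

In practice these totals would be persisted per batch so a failed reconciliation can trigger a replay of just that batch rather than a full reload.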
Design and develop conditionalized, generic Ab Initio graphs, PSETs, transforms (XFRs), and DMLs to reduce code redundancy and improve code maintainability and readability.
Work with business teams to provide enhancements and performance improvements and to develop new capabilities in existing Ab Initio graphs.