· 8+ years of architecture and hands-on development experience in Enterprise Data Warehouse environments, with a focus on Data Integration (ETL) using a data integration tool such as Informatica, DataStage, Talend, Ab Initio, or Pentaho.
· 2+ years of hands-on experience with Talend Big Data Integration, i.e. designing and developing ETL scripts and creating and deploying end-to-end Talend Big Data Integration solutions.
· Expert in ETL concepts: data integration, data migration, data flow, data enrichment, data synchronization, change data capture, and transformations.
· Hands-on experience with data profiling, data modeling, development of staging/ODS/EDW layers, and presentation layers (star/snowflake schemas).
· Expert-level understanding of ETL frameworks: developing audit, balance, and control processes, validation architecture, reconciliation, etc.
· Proficient with SQL, complex SQL tuning, stored procedures, and data warehousing & data integration best practices.
· Collaborate across multiple teams to research, architect, engineer, and configure complex data integration solutions to specified requirements in support of global, business-critical systems.
· Coordinate and oversee the assignments, delivery, and quality of deliverables.
· Ability to work with Senior Enterprise Architects and Data Architects to develop the overall data integration roadmap.
· Working experience with Hadoop ecosystem technologies (Hive, Pig, Spark), distributed scalable data stores (HBase, Redshift), relational and NoSQL databases (MongoDB, Cassandra, etc.), and business intelligence platforms/data quality tools would be an advantage.
· Working experience with cloud technologies (AWS, Azure, Google Cloud, Snowflake, etc.) would be a big plus.
· Experience working in Agile environments.
· Excellent communication, presentation, and documentation skills.
· Must be a team player with knowledge sharing capability.