Skills
10+ years of experience in big data and distributed computing.
Very strong hands-on experience with PySpark, Apache Spark, and Python.
Strong hands-on experience with SQL and NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
Proficiency in data modeling and ETL workflows.
Proficiency with workflow schedulers like Airflow.
Hands-on experience with AWS cloud-based data platforms.
Experience in DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
Strong problem-solving skills and the ability to lead a team.
Job Details
Design, develop, modify, configure, and debug Informatica workflows using Informatica PowerCenter and PowerExchange CDC tools.
Lead the design, development, and maintenance of data integration solutions using Informatica, ensuring data quality.
Troubleshoot and resolve technical issues; debug, tune, and optimize code for performance.
Manage new requirements, review existing jobs, perform gap analysis, and fix performance issues.
Document all ETL mappings, sessions, and workflows.
Handle tickets and analyze problem tickets in an Agile/POD delivery model.