Mandatory Qualifications:
Level III: More than seven (7) years of experience working on complex projects,
with two (2) or more years in a leadership role as a Developer
More than seven (7) years of experience designing end-to-end architecture for an enterprise Data Integration hub providing centralized data ingestion and
transformation across various ingestion patterns (batch and event-driven)
More than seven (7) years of proven experience with SQL/NoSQL databases, API
development (REST/SOAP), version control (Git), and building data integrations
between on-premises and cloud systems
Desirable Qualifications:
More than seven (7) years of proven experience standardizing API
consumption, error handling, retries, and throttling
More than seven (7) years of experience managing schema evolution and
backward compatibility to support legacy transformations
More than seven (7) years of hands-on experience designing and implementing
a data integration hub
More than seven (7) years of proven experience with complex API-based workflows,
data governance, and metadata and lineage tooling
More than seven (7) years of experience integrating Informatica with
Databricks as the hub's transformation and curation engine
More than seven (7) years of experience designing hub-based data
publishing mechanisms for data warehouses and analytics
More than seven (7) years of proven experience with security standards (OAuth,
SAML), data governance, and performance tuning, plus relevant cloud/integration
certifications
More than seven (7) years of experience with event streaming platforms and cloud integration architectures
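For illustration only, the API-consumption standards named above (error handling, retries, throttling) are the kind of pattern a candidate would be expected to implement. A minimal sketch follows; the function names, attempt limits, and delays are hypothetical, not part of the posting's requirements, and a production client would also honor throttling signals such as HTTP 429 Retry-After headers.

```python
import time

def call_with_retries(fn, max_attempts=4, base_delay=0.1):
    """Call fn(); on transient failure, retry with exponential backoff.

    fn, max_attempts, and base_delay are illustrative names, not a
    required interface.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error to the caller
            # Exponential backoff: 0.1s, 0.2s, 0.4s, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a flaky call that succeeds on its third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky))  # prints "ok" after two retries
```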
Duties/Responsibilities:
Design and develop service-based data integration architectures. Build ELT
pipelines with Databricks using PySpark and Databricks SQL. Implement batch
and incremental data loads, including CDC (change data capture) patterns.
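As a minimal sketch of the incremental-load pattern mentioned above, the logic below uses a high-water mark to pick up only rows changed since the previous run. Plain Python stands in for PySpark here, and the sample rows and the `updated_at` watermark column are made up for illustration; on Databricks this would be a PySpark job reading from the source system.

```python
# Hypothetical source rows; in practice these would come from a
# source table or CDC feed, not an in-memory list.
source_rows = [
    {"id": 1, "value": "a", "updated_at": 100},
    {"id": 2, "value": "b", "updated_at": 150},
    {"id": 3, "value": "c", "updated_at": 200},
]

def incremental_load(rows, last_watermark):
    """Return rows changed since the previous run, plus the new watermark."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    # Advance the watermark to the latest change seen; keep the old one
    # if nothing changed.
    new_watermark = max((r["updated_at"] for r in new_rows),
                        default=last_watermark)
    return new_rows, new_watermark

# Previous run stopped at watermark 120, so only ids 2 and 3 are loaded.
changed, wm = incremental_load(source_rows, last_watermark=120)
print(len(changed), wm)  # prints "2 200"
```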
Create reusable, standardized integration patterns. Establish logical and physical
integration layers for curation and distribution.
Work with cross-functional teams (developers, product managers, etc.) to
define integration requirements and ensure successful integration.
Identify and resolve integration-related issues, optimize performance, and ensure data security.
Create and maintain clear, comprehensive documentation for internal
users and stakeholders.