Overview
Remote
Depends on Experience
Accepts Corp-to-Corp applications
Contract - Independent
Contract - W2
Contract - 1 Year(s)
100% Travel
Skills
API
Amazon Kinesis
Amazon S3
Amazon Web Services
Analytics
Apache Kafka
Caching
Change Data Capture
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Data Flow
Data Integration
Data Modeling
Data Quality
Database
Databricks
DevOps
Documentation
EDC
ELT
GCS
Git
GitHub
Google Cloud Platform
Informatica
Information Security Governance
Jenkins
Mass Ingestion
Management
Mapping
Microsoft Azure
Network Layer
Orchestration
PL/SQL
Performance Tuning
Python
Real-time
Replication
Root Cause Analysis
SAP
SCD
SOAP
SQL
SaaS
Salesforce.com
Scala
Snowflake Schema
Software Design
Specification Gathering
Storage
Streaming
Use Cases
Warehouse
Job Details
Title: Informatica CDI Developer
Location: Remote
Duration: Long Term on W2/1099
Experience: 8+ Years
Role Summary:
Build and optimize cloud-native data pipelines using Informatica Cloud Data Integration (CDI) to move, transform, and synchronize data across on-prem and cloud platforms (AWS/Azure/Google Cloud Platform). Partner with data engineers, platform teams, and business stakeholders to deliver reliable, secure, and performant data flows for analytics and operational use cases.
Key Responsibilities
- Design, develop, and maintain IICS/CDI mappings, tasks, taskflows, and parameterized reusable components.
- Implement batch and near-real-time integrations (file, DB, API, streaming), including CDC from sources such as Oracle, SQL Server, Snowflake, Salesforce, SAP, etc.
- Build Mass Ingestion and Replication jobs; configure agents, connections, secure gateways, and runtime environments.
- Optimize performance: pushdown, partitioning, dynamic mappings, caching, error handling/retry, and workload orchestration.
- Implement data quality checks (DQ rules, profiling, validations) and integrate with monitoring/alerting (Operational Insights, Cloud Monitoring, custom dashboards).
- Enforce security and governance: parameter files, secrets management, role-based access, PII masking, and lineage documentation.
- Integrate pipelines with CI/CD (Git-based versioning, branching, deployment via IICS APIs/CI tools).
- Collaborate with architects on solution design (layered zones, SCD/CDC patterns, ELT to Snowflake/Databricks/BigQuery/Synapse).
- Produce technical documentation (design specs, runbooks) and provide L3 support, root-cause analysis, and remediation.
- Contribute to standards, code reviews, and reusable frameworks/accelerators.
Core Qualifications
- 5+ years of hands-on IICS/CDI experience (mappings, taskflows, parameterization, pushdown, partitions).
- Strong SQL and experience with at least one major cloud warehouse (Snowflake/Databricks/BigQuery/Synapse).
- Experience with CDC (log- or trigger-based), file/object storage (S3/ADLS), and REST/SOAP connectors.
- Proficiency with Git and CI/CD (Azure DevOps, GitHub Actions, Jenkins, or similar).
- Solid grasp of data modeling (star/snowflake schemas), SCD types, and performance tuning.
Nice to Have
- Informatica Cloud Application Integration (CAI) or CDI-Elastic; Mass Ingestion Framework.
- Python/Scala for utilities, Airflow/Azure Data Factory orchestration.
- Kafka/Kinesis/Event Hub streaming patterns.
- Informatica DQ/EDC/IDMC ecosystem exposure.
- Certifications: Informatica IICS, Snowflake/Databricks, cloud (AWS/Azure/Google Cloud Platform).
Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it to correctly reflect the job opportunity.