Informatica Developer

Overview

On Site
Full Time
Part Time
Accepts corp to corp applications
Contract - W2
Contract - Independent

Skills

Informatica PowerCenter
PySpark
Apache Spark
SQL/NoSQL databases
AWS cloud services

Job Details

Job Description:

Our client is seeking a highly experienced Informatica Developer with a strong background in big data, ETL workflows, and cloud-based data platforms. The ideal candidate will have extensive hands-on experience with Informatica PowerCenter, PySpark, Apache Spark, SQL/NoSQL databases, and AWS cloud services. This role requires excellent technical skills, leadership capabilities, and a passion for problem-solving in a fast-paced Agile environment.

Key Responsibilities:

  • Design, develop, and maintain data integration solutions using Informatica PowerCenter and PowerExchange CDC tools.
  • Build scalable and efficient ETL pipelines using PySpark, Apache Spark, and Python.
  • Develop and maintain workflows utilizing Airflow or similar orchestration tools.
  • Implement data solutions across SQL and NoSQL databases including DB2, PostgreSQL, and Snowflake.
  • Optimize ETL workflows for performance, scalability, and reliability.
  • Troubleshoot data issues and provide timely resolution through thorough root cause analysis.
  • Participate in Agile development processes including sprint planning, ticket handling, and peer reviews.
  • Create comprehensive documentation for ETL mappings, sessions, and workflows.
  • Collaborate with cross-functional teams including data engineers, DevOps, and QA to deliver high-quality solutions.
  • Contribute to DevOps efforts by participating in CI/CD pipelines and in containerization using Docker and Kubernetes.

Professional Skills:

  • 10+ years of experience in big data and distributed computing.
  • 7+ years of strong hands-on experience with Informatica PowerCenter.
  • Proficiency in PySpark, Apache Spark, and Python.
  • Hands-on expertise in SQL/NoSQL databases (DB2, PostgreSQL, Snowflake, etc.).
  • Strong understanding of ETL design, data modeling, and best practices.
  • Experience working with workflow schedulers such as Apache Airflow.
  • Prior experience working with AWS cloud data services.
  • Familiarity with DevOps, CI/CD tools, and containerization technologies.
  • Excellent analytical and problem-solving skills.
  • Ability to lead, mentor, and guide junior team members.
  • Strong verbal and written communication skills.


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.