SR Data Engineer / Informatica Developer

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - 6 Month(s)

Skills

Informatica PowerCenter
SQL
AWS for cloud
Snowflake for data lakes
Dental/healthcare insurance industry

Job Details

  1. What are the major objectives of the role? Specifically, what does this person need to do to be considered a success? What will they be working on?
  • Building and optimizing ETL processes using Informatica PowerCenter to extract, transform, and load data from multiple sources into centralized data warehouses or cloud environments.
  • Ensuring data quality, consistency, and accuracy across systems by implementing validation, cleansing, and transformation logic (a minimal sketch of this kind of logic follows this list).
  • Developing and optimizing SQL queries for efficient data retrieval, analysis, and reporting.
  • Leveraging cloud platforms (such as AWS, Azure, or Google Cloud Platform) to design scalable, secure, and cost-effective BI solutions.
  • Collaborating with business stakeholders to understand reporting and analytics needs, then translating them into technical solutions.
  • Enabling self-service analytics by delivering structured data models, dashboards, and reporting frameworks for end-users.
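
By way of illustration only, here is a minimal sketch of the validation and cleansing logic referenced above, in plain SQL; every table and column name (raw_claims, stg_claims, and their fields) is hypothetical:

    -- Hypothetical staging step: keep the latest version of each claim and
    -- filter out rows that fail basic validation before loading the warehouse.
    INSERT INTO stg_claims (claim_id, member_id, claim_amount, service_date, load_ts)
    SELECT claim_id, member_id, claim_amount, service_date, CURRENT_TIMESTAMP
    FROM (
        SELECT c.*,
               ROW_NUMBER() OVER (
                   PARTITION BY claim_id
                   ORDER BY updated_at DESC    -- most recent record wins
               ) AS rn
        FROM raw_claims c
    ) deduped
    WHERE rn = 1
      AND claim_amount >= 0                    -- reject negative amounts
      AND service_date <= CURRENT_DATE;        -- reject future-dated claims

In PowerCenter itself this logic would typically live in Expression, Filter, and Aggregator transformations; the SQL form is shown only for concreteness.
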
  2. What are the MUST-HAVE technologies for this position?

(Please list must-have technologies / technical skills and what the candidate needs to do to be considered great at them)

  • ETL / Data Integration Tools
    • Informatica PowerCenter (core requirement for ETL design and data integration)
    • Informatica Cloud (for hybrid/cloud data integration, if applicable)
  • Databases and Query Languages
    • SQL (advanced proficiency for writing complex queries, stored procedures, and performance tuning)
    • Relational databases such as Oracle, SQL Server, PostgreSQL, or MySQL
    • Exposure to data warehousing concepts (star/snowflake schema, fact/dimension modeling; see the sketch after this list)
  • Cloud Platforms (at least one major provider; per the hiring manager notes below, AWS is preferred and Azure is acceptable)
    • AWS (Redshift, S3, Lambda, Glue)
    • Azure (Synapse Analytics, Data Factory, Blob Storage)
  • Data Modeling & Warehousing
    • Dimensional modeling
    • Data warehouse/lakehouse platforms (Snowflake, Databricks, or equivalent)
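
As a rough sketch of the dimensional modeling called out above, assuming hypothetical fact_claims, dim_date, and dim_provider tables in a star schema:

    -- Typical star-schema rollup: one fact table joined to conformed dimensions.
    -- All table and column names here are hypothetical.
    SELECT d.calendar_year,
           p.provider_specialty,
           SUM(f.paid_amount) AS total_paid,
           COUNT(*)           AS claim_count
    FROM fact_claims f
    JOIN dim_date     d ON f.service_date_key = d.date_key
    JOIN dim_provider p ON f.provider_key     = p.provider_key
    GROUP BY d.calendar_year, p.provider_specialty
    ORDER BY d.calendar_year, total_paid DESC;
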
  3. What are the MUST-HAVE Critical Skills for this position?

(For critical skills, please also describe what the person needs to do with them to be considered very good at it.)

  • Performance Optimization
    • Experience in tuning ETL jobs and optimizing SQL queries for large data volumes (see the tuning sketch after this list).
    • Ensuring data pipelines are efficient, reliable, and scalable.
  • Data Quality & Governance
    • Implementing data validation, cleansing, and transformation rules.
    • Understanding of data security, compliance, and governance best practices.
  • Problem-Solving & Analytical Thinking
    • Strong skills in analyzing business requirements and translating them into technical solutions.
    • Ability to troubleshoot complex ETL, SQL, and data pipeline issues.
  • Collaboration & Communication
    • Ability to work closely with business stakeholders to understand reporting needs.
    • Clear communication of technical concepts to non-technical users.
  • Adaptability & Continuous Learning
    • Keeping up with evolving cloud technologies and BI tools.
    • Flexibility to work across different databases, integration tools, and visualization platforms.
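
To make the tuning expectation concrete, one common pattern is replacing a per-row correlated subquery with a set-based aggregation; the members and claims tables here are hypothetical:

    -- Before: the subquery re-scans claims once per member row.
    SELECT m.member_id,
           (SELECT MAX(c.service_date)
            FROM claims c
            WHERE c.member_id = m.member_id) AS last_service_date
    FROM members m;

    -- After: one pass over claims, then a single join. On large volumes this
    -- typically lets the optimizer hash-aggregate and hash-join instead of
    -- executing the subquery per row.
    SELECT m.member_id, agg.last_service_date
    FROM members m
    LEFT JOIN (
        SELECT member_id, MAX(service_date) AS last_service_date
        FROM claims
        GROUP BY member_id
    ) agg ON agg.member_id = m.member_id;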

  4. What are the NICE-TO-HAVE technologies you wouldn't mind seeing on a candidate's resume?
  • Advanced Cloud Data Ecosystem
    • Azure: Data Factory, Databricks, Cosmos DB
    • Snowflake or Databricks for modern data warehousing and lakehouse solutions
  • BI & Visualization Tools
    • Power BI, Tableau, Qlik, or Looker for dashboarding and self-service analytics
  • Programming & Scripting Languages
    • Python or R for data manipulation, automation, and advanced analytics
    • Shell scripting for workflow automation and ETL orchestration
  • Data Governance & Quality Tools
    • Collibra, Alation, or Informatica Data Quality (IDQ) for metadata management and governance
    • Master Data Management (MDM) tools for enterprise data consistency

Other Pertinent Information - Responsibilities, Skills, Qualifications, etc.

  • ETL Tools: Informatica PowerCenter, IICS, Informatica Developer (IDQ), SSIS, Metadata Manager
  • Cloud & Infrastructure: AWS (Redshift, S3, Lambda, Glue, Kinesis), dbt Cloud
  • Languages & Tools: Python, SQL, PySpark, dbt, Pandas (a dbt model sketch follows this list)
  • Languages & Querying: T-SQL, Dynamic SQL, PL/SQL
  • Databases: Microsoft SQL Server, Azure SQL, Oracle, PostgreSQL
  • Data Warehousing: Redshift, Snowflake, Delta Lake, Apache Iceberg
  • Data Engineering: Spark, Airflow, Airbyte, Stitch, Kafka
  • Reporting Tools: SQL Server Reporting Services (SSRS), Power BI
  • Domains: Healthcare, Public Sector, Insurance, Retail, Finance
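
Since dbt appears in the stack above, here is a minimal sketch of an incremental dbt model (SQL plus Jinja); the model and source names (fct_claims, stg_claims) and their columns are hypothetical:

    -- models/fct_claims.sql (hypothetical dbt model)
    {{ config(materialized='incremental', unique_key='claim_id') }}

    SELECT claim_id,
           member_id,
           provider_id,
           paid_amount,
           service_date
    FROM {{ ref('stg_claims') }}
    {% if is_incremental() %}
      -- On incremental runs, process only rows newer than what is already loaded.
      WHERE service_date > (SELECT MAX(service_date) FROM {{ this }})
    {% endif %}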

NOTES FROM CALL WITH HIRING MANAGER:

Position: Sr. Data Engineer using Informatica PowerCenter ETL to build data pipelines, plus SQL.

Also looking for someone who can solve performance challenges. Not looking for someone who's just good at the technology or just about the tool, but someone focused on how we do it.

Building pipeline solutions, including building out a new hierarchy.

Work will be roughly 80% building pipelines, 10% meetings, and 10% solving performance challenges.

Cloud platform preference is AWS; Azure would be OK too. The data lake is Snowflake.
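
For context on that last point, a minimal sketch of landing S3 data in Snowflake; the stage, storage integration, bucket path, and table names are all hypothetical:

    -- Hypothetical external stage over an S3 bucket; s3_int stands in for an
    -- existing Snowflake storage integration.
    CREATE OR REPLACE STAGE claims_stage
      URL = 's3://example-bucket/claims/'
      STORAGE_INTEGRATION = s3_int;

    -- Bulk-load the staged files, skipping bad rows rather than failing the load.
    COPY INTO raw_claims
    FROM @claims_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE';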
