Data Engineer (Informatica/Snowflake Cloud)

Python, data transformation, Snowflake, Snowpark, virtual warehouse sizing, query performance tuning, Informatica data tools, Informatica IICS, PowerCenter, DEI/BDM, unit testing, system integration testing, user acceptance testing, designing and developing data loading processes, Oracle, flat files, Web Services, Azure Cloud, SQL, Git
Contract W2
Depends on Experience
Travel not required

Job Description

Are you a Data Engineer with experience in Python and Snowflake looking for a challenge to grow your skills? Are you ready to work for a great team within a great organization? If so, then this role may be for you!

As a Data Engineer, you will build and work on end-to-end data projects using a modern cloud tech / data stack. You’ll use your pipeline-building skills to build data systems from the ground up and support internal stakeholders through modern data practices, tools, and platforms like Snowflake Data Cloud, Python, Informatica IICS, and Azure. Your expertise as a data engineer will help you make recommendations for improving design, development, and implementation.


This is a long-term contract role, including full benefits and the possibility of extension.

Location: Cedar Rapids, IA


  • Designs and develops data pipeline (ELT/ETL) architecture to load data from a wide range of sources into Snowflake and Azure.
  • Plans, analyzes, designs, codes, tests, implements, and maintains moderately complex data projects.
  • Assists team members with design and acts as a resource for particularly difficult design projects.
  • Translates operational requirements for moderately complex systems. Recommends and may lead the definition of user stories, design workshops, data modeling, and prototyping.
  • Performs and may lead the testing of complex systems to ensure system reliability prior to implementation.
  • Understands data pipelines and modern ways of automating them using cloud-based implementations.
  • Tests and clearly documents requirements to create technical and functional specs.
  • Identifies, designs, and implements internal process improvements, such as automating manual processes and optimizing data processing.
  • Helps set up and maintain CI/CD pipelines.

Required Skills:

  • Bachelor’s degree in Information Technology or related field
  • 5 years of related experience in a similar or data engineering role
  • Proficiency in Python programming for data transformation
  • Hands-on development experience with the Snowflake data platform, including Snowpipe, tasks, stored procedures, streams, resource monitors, Snowpark, virtual warehouse sizing, query performance tuning, cloning, Time Travel, and data sharing
  • Solid experience working with Informatica data tools, with a focus on Informatica IICS, PowerCenter, and DEI/BDM
  • Solid experience with unit testing, system integration testing, and user acceptance testing
  • Experience designing and developing data loading processes to load data from a wide range of sources, such as Oracle, flat files, Web Services, and Azure Cloud
  • Expertise in developing and implementing business logic through SQL.
  • Experience with Git or a similar version control/source code management tool


If you are interested, apply today!

Smart Solutions, Inc. is an equal opportunity employer functioning under an Affirmative Action Plan.

Dice Id : smawi001
Position Id : 11765
Originally Posted : 2 months ago