Data Engineer - Level 2

Overview

Hybrid
Depends on Experience
Contract - W2

Skills

Informatica
ETL
SQL
PL/SQL
Microsoft SQL Server
Oracle
Teradata
Udeploy
Artifactory
Azure
AWS
GCP
Hadoop
Hive
Spark
Python
Scala

Job Details

Hi there,

Please note that this position is only open to candidates who can work on a W2 basis. No C2C or 1099 arrangements will be considered.

Position Title: Data Engineer (Level 2)
Position Location: Tech Hubs
Locations (in order of preference):
Most preferred - Pittsburgh PA
All other locations secondary:

  • Cleveland OH - Strongsville

  • Birmingham AL

  • North Texas Market TX

  • Phoenix - Biltmore AZ

Ability to work remote: Hybrid - 3/2
Travel: Not regular or frequent
Days of the week: M-F
Working Hours (Flexible): 8 am-5 pm, 40 hours/week
Acceptable time zone(s): EST hours
OT: Very rarely
Target Start Date: 7/28/25
Intended Length of Assignment: through 12/31/25
Potential for Contract Extension: No
Conversion Potential: Yes - contingent worker must be open to converting to a full-time employee
Sponsorship: No work visa sponsorship upon conversion

Function of the group: Supports Enterprise Profitability

Current initiatives/project(s): Continuous support of backend functionalities

Industry background required: Banking/Financial background is required

Team dynamic: 5-7 member technology team, working closely with Finance Teams

Roles and Responsibilities:

  • 80% hands-on data engineering, 20% innovation

  • Collaborate with product owners and analysts to gather requirements

  • Analyze and evaluate requirements and provide scope recommendations

  • Write and optimize code for data pipelines to handle large-scale financial datasets

  • Build and optimize data architectures with performance, reliability, and security in mind

  • Propose and design data solutions for complex business needs

  • Develop technical specifications and conduct hands-on coding/testing

  • Participate in code reviews and testing cycles

Technical Skills Required (3+ years):

  1. Strong knowledge of data pipelines and the Informatica ETL tool

  2. Intermediate to advanced SQL and PL/SQL skills

  3. Strong understanding of databases: Microsoft SQL Server, Oracle, Teradata

  4. Proficiency with code deployment tools such as uDeploy and Artifactory

Flex (Nice-to-Have) Skills:

  • Experience in cloud environments: Azure, AWS, Google Cloud Platform

  • Big data tools: Hadoop, Hive, Spark

  • Programming languages: Python, Scala

Soft Skills:

  • Strong communication and collaboration

  • Proactive, detail-oriented, ownership-driven

  • Creative problem-solver for complex data challenges

  • Mentorship skills to guide junior engineers

Degrees/Certifications:

  • Experience can be considered in lieu of a degree

  • Preferred: Database certifications, Informatica certification

Interview Process:

  1. Manager Interview (video): soft skills + high-level technical (45 min)

  2. Team Interview (video): technical assessment (1 hour)

  3. Leadership Interview (video): (45 min)
