Data Engineer

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)

Skills

Data Engineer
ETL Developer
Snowflake
StreamSets
Informatica
Python
PL/SQL
T-SQL
Oracle
SQL Server
CDC
Salesforce
Data Pipelines
Cloud Data Platform
Data Integration

Job Details

Title: Data Engineer

Location: Hybrid - Okemos, MI (Onsite 2-3 Days per Week)

Interview Type: In-Person Interview Required

Experience: 3+ Years

Duration: Long-Term Contract

Open to W2 candidates only

Our client is seeking a Data Engineer to join its growing Data Engineering Team. The ideal candidate will have at least 3 years of hands-on experience building and maintaining data pipelines, ETL workflows, and enterprise data warehouse (EDW) solutions.

You'll play a key role in developing change data capture (CDC) mechanisms to migrate and transform data from on-premises systems to cloud platforms like Snowflake, ensuring data is accessible and reliable for enterprise consumption.
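For candidates less familiar with CDC, the core idea can be illustrated with a minimal snapshot-diff sketch. This is purely illustrative and not the client's implementation: production pipelines (StreamSets, Snowflake streams, Informatica) typically use log-based CDC rather than full-table comparison, and the keys and rows below are hypothetical.

```python
def diff_snapshots(old, new):
    """Compare two {primary_key: row} snapshots and emit CDC events.

    Classifies each change as INSERT, UPDATE, or DELETE -- the three
    event types a downstream target (e.g. a Snowflake table) must apply.
    """
    events = []
    for key, row in new.items():
        if key not in old:
            events.append(("INSERT", key, row))   # new key appeared
        elif old[key] != row:
            events.append(("UPDATE", key, row))   # key exists, row changed
    for key, row in old.items():
        if key not in new:
            events.append(("DELETE", key, row))   # key vanished from source
    return events

# Hypothetical example: one update, one delete, one insert
before = {1: {"name": "Ann"}, 2: {"name": "Bob"}}
after = {1: {"name": "Anne"}, 3: {"name": "Cy"}}
print(diff_snapshots(before, after))
```

Log-based CDC avoids the cost of rescanning the source by reading the database's transaction log, but the downstream event model (insert/update/delete) is the same.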

Responsibilities:

  • Participate in the design, development, and testing of Data Engineering components.
  • Build, configure, and maintain automated data pipelines and ETL workflows that transfer and transform data from on-prem sources to cloud destinations (e.g., Snowflake, Salesforce).
  • Support and enhance enterprise data standards, identifying and addressing process gaps.
  • Work closely with the Lead Data Engineer, BI Architect, and Architecture Team to ensure alignment with design and technology best practices.
  • Create and maintain stored procedures, functions, and scripts for automated data movement and transformation.
  • Support data pipeline monitoring, troubleshooting, and continuous improvement.
  • Participate in a 24/7 on-call rotation for production support.
  • Collaborate across business and technical teams to deliver scalable, efficient data solutions.

Required Skills & Qualifications:

  • Database Platforms: Snowflake, Oracle, SQL Server
  • Operating Systems: RedHat Enterprise Linux, Windows Server
  • Languages & Tools:
    • PL/SQL, Python, T-SQL
    • StreamSets, Informatica PowerCenter, Informatica IICS or IDMC
    • Snowflake Cloud Data Platform
  • Experience developing and maintaining ETL processes using Salesforce as a destination.
  • Strong analytical, troubleshooting, and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Proven ability to automate repeatable processes and optimize data workflows.

Desired Skills & Experience:

  • Experience creating and managing Snowflake solutions involving internal file stages, procedures, tasks, and dynamic tables.
  • Hands-on experience with real-time or near-real-time data pipelines.
  • Familiarity with data streaming/pipelining tools such as StreamSets, Fivetran, Striim, or Airbyte.
