Data Engineer

  • Ann Arbor, MI
  • Posted 8 hours ago | Updated 8 hours ago

Overview

Remote or On Site
USD 120,000.00 - 140,000.00 per year
Full Time

Skills

Analytics
Real-time
Legacy Systems
Apache Spark
Apache Parquet
SFTP
Data Quality
Data Deduplication
Data Integration
Dashboard
Unity Catalog
Stored Procedures
Data Processing
ELT
Management
Data Extraction
Python
C#
Database
Extract, Transform, Load (ETL)
Microsoft SSIS
Microsoft SSRS
Microsoft Power BI
API
ASP.NET
Web API
RESTful
Cloud Computing
SQL Azure
Storage
Roadmaps
Apache Kafka
Data Integrity
Access Control
Regulatory Compliance
JIRA
Confluence
Microsoft Visio
Microsoft SQL Server
Transact-SQL
Performance Tuning
Query Optimization
PySpark
SQL
FOCUS
Data Lake
Streaming
Problem Solving
Conflict Resolution
Debugging
Collaboration
Communication
Documentation
Computer Science
Databricks
Microsoft Azure
Big Data
Git
Continuous Integration
Continuous Delivery
Workflow
Agile
Mortgage Servicing
Finance
Accounting
Marketing
Legal
Customer Support
Online Training
Artificial Intelligence
Insurance
.NET

Job Details

Description

Our client is undergoing a major digital transformation, shifting toward a cloud-native, API-driven infrastructure. They're looking for a Data Engineer to help build a modern, scalable data platform that supports this evolution. This role will focus on creating secure, efficient data pipelines, preparing data for analytics, and enabling real-time data sharing across systems.

As the organization transitions from legacy systems to more dynamic, event-driven, API-integrated models, the Data Engineer will be instrumental in modernizing the data environment, particularly across the bronze, silver, and gold layers of its medallion architecture.
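For illustration only, here is a minimal PySpark sketch of what a bronze-to-silver promotion in this kind of medallion architecture might look like on Databricks. The paths, table names, and columns (raw_loans, loan_id, as_of_date) are hypothetical, not taken from the client's environment.

  # Hypothetical bronze -> silver promotion; paths and columns are illustrative.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.getOrCreate()

  # Bronze: raw, append-only data as landed from the source system
  bronze = spark.read.format("delta").load("/mnt/datalake/bronze/raw_loans")

  # Silver: deduplicated, validated, and typed records
  silver = (
      bronze
      .dropDuplicates(["loan_id", "as_of_date"])          # deduplication
      .filter(F.col("loan_id").isNotNull())                # basic data-quality check
      .withColumn("as_of_date", F.to_date("as_of_date"))   # enforce types
  )

  (silver.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("as_of_date")    # partitioning to support downstream merges
      .save("/mnt/datalake/silver/loans"))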

Key Responsibilities:
  • Design and deploy scalable data pipelines in Azure using tools like Databricks, Spark, Delta Lake, DBT, Dagster, Airflow, and Parquet.
  • Build workflows to ingest data from various sources (e.g., SFTP, vendor APIs) into Azure Data Lake.
  • Develop and maintain data transformation layers (Bronze/Silver/Gold) within a medallion architecture.
  • Apply data quality checks, deduplication, and validation logic throughout the ingestion process.
  • Create reusable and parameterized notebooks for both batch and streaming data jobs.
  • Implement efficient merge/update logic in Delta Lake using partitioning strategies (see the Delta merge sketch after this list).
  • Work closely with business and application teams to gather data integration requirements and deliver against them.
  • Support downstream integrations with APIs, Power BI dashboards, and SQL-based reports.
  • Set up monitoring, logging, and data lineage tracking using tools like Unity Catalog and Azure Monitor.
  • Participate in code reviews, design sessions, and agile backlog grooming.
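As a hedged illustration of the merge/update bullet above, the sketch below shows an incremental upsert into a partitioned Delta table using the Delta Lake Python API. The paths, key columns, and table layout are assumptions; including the partition column in the merge condition is what lets Delta prune files instead of scanning the whole table.

  # Hypothetical incremental upsert (paths and columns are illustrative).
  from delta.tables import DeltaTable
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  updates = spark.read.format("delta").load("/mnt/datalake/silver/loans_increment")
  target = DeltaTable.forPath(spark, "/mnt/datalake/gold/loans")

  (target.alias("t")
      .merge(
          updates.alias("s"),
          # matching on the partition column (as_of_date) enables file pruning
          "t.loan_id = s.loan_id AND t.as_of_date = s.as_of_date"
      )
      .whenMatchedUpdateAll()
      .whenNotMatchedInsertAll()
      .execute())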

Additional Technical Duties:
  • SQL Server Development: Write and optimize stored procedures, functions, views, and indexing strategies for high-performance data processing.
  • ETL/ELT Processes: Manage data extraction, transformation, and loading using SSIS and SQL batch jobs (see the sketch below).
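As a sketch of the batch-load duty above, one common pattern is to drive a SQL Server load step from a Python batch job by calling a stored procedure over ODBC. The server, database, credentials, and procedure name (dbo.usp_load_daily_loans) below are placeholders, not the client's actual objects.

  # Hypothetical batch ELT step: run a SQL Server stored procedure via pyodbc.
  import pyodbc

  conn = pyodbc.connect(
      "DRIVER={ODBC Driver 18 for SQL Server};"
      "SERVER=example.database.windows.net;DATABASE=dw;"
      "UID=etl_user;PWD=<secret>"
  )
  try:
      cur = conn.cursor()
      # Parameterized call keeps the batch job simple and injection-safe
      cur.execute("EXEC dbo.usp_load_daily_loans @AsOfDate = ?", "2025-01-31")
      conn.commit()
  finally:
      conn.close()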

Tech Stack:
  • Languages & Frameworks: Python, C#, .NET Core, SQL, T-SQL
  • Databases & ETL Tools: SQL Server, SSIS, SSRS, Power BI
  • API Development: ASP.NET Core Web API, RESTful APIs
  • Cloud & Data Services (Roadmap): Azure Data Factory, Azure Functions, Azure Databricks, Azure SQL Database, Azure Data Lake, Azure Storage
  • Streaming & Big Data (Roadmap): Delta Lake, Databricks, Kafka (preferred but not required)
  • Governance & Security: Data integrity, performance tuning, access control, compliance
  • Collaboration Tools: Jira, Confluence, Visio, Smartsheet


Requirements

Skills & Competencies:
  • Deep expertise in SQL Server and T-SQL, including performance tuning and query optimization
  • Strong understanding of data ingestion strategies and partitioning
  • Proficiency in PySpark/SQL with a focus on performance
  • Solid knowledge of modern data lake architecture and structured streaming
  • Excellent problem-solving and debugging abilities
  • Strong collaboration and communication skills, with attention to documentation

Qualifications:
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)
  • 5+ years of experience building data pipelines and distributed data systems
  • Strong hands-on experience with Databricks, Delta Lake, and Azure big data tools
  • Experience working in financial or regulated data environments is preferred
  • Familiarity with Git, CI/CD workflows, and agile development practices
  • Background in mortgage servicing or lending is a plus


Technology Doesn't Change the World, People Do.

Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.

Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. Download the Robert Half app and get 1-tap apply, notifications of AI-matched jobs, and much more.

All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.

© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.