Data Quality Engineer

Overview

On Site
Hybrid
55 - 65
Contract - W2
Contract - 6 Month(s)
75% Travel
Unable to Provide Sponsorship

Skills

Amazon Redshift
Amazon S3
Amazon Web Services
AWS Glue DataBrew
Cloud Computing
Data Analysis
Data Engineering
Data Governance
Data Integrity
Data Modeling
Data Profiling
Data Quality
Data Warehouse
ELT
Extract, Transform, Load
Orchestration
Performance Tuning
Python
Reporting
SQL
Scripting
Snowflake Schema
Health Care

Job Details

Sr. Data Quality Engineer

(6-month contract-to-hire)

Hybrid (2-3 days per week onsite in Eagan, MN).

The selected candidate MUST be local (currently living in Minnesota, preferably near Eagan, MN).

U.S. citizen (preferred) or other eligible work status, and serious about converting to full-time employment.

Strong communication skills are expected.
A senior-level candidate is needed.

Details:
• SQL
• Python
• Key AWS data services: AWS Glue, Amazon Redshift, AWS Glue DataBrew
• Healthcare experience is preferred

Job Description:
We are seeking an experienced and meticulous Sr. Data Quality Engineer to ensure the reliability, accuracy, and completeness of our critical data assets. This role is essential for maintaining data integrity and building business trust by developing robust testing frameworks and implementing continuous data quality monitoring across our cloud-based data platform. The ideal candidate will possess deep expertise in SQL, Python, and key AWS data services, particularly AWS Glue, Amazon Redshift, and AWS Glue DataBrew.

Key Technical Responsibilities
The Data Quality Engineer will be responsible for the following:
I. Data Quality Framework Development & Automation
• Design, develop, and maintain end-to-end data quality frameworks using Python to automate testing, validation, and analysis of data pipelines and data warehouse tables.
• Build and implement custom data quality checks (e.g., uniqueness, completeness, validity, consistency, timeliness) and anomaly detection scripts within the automated framework.

• Integrate data quality checks directly into CI/CD pipelines to prevent poor-quality data from reaching production environments.

• Develop reporting mechanisms and dashboards to track and visualize key Data Quality Metrics (e.g., completeness, accuracy rates, latency, and compliance adherence) for business stakeholders.
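For illustration only (not part of the role's actual tooling), the kinds of custom checks described above, such as uniqueness and completeness, might be sketched in Python with pandas; the "claims" table and column names below are hypothetical:

```python
# Hypothetical sketch of automated data quality checks using pandas.
# Table and column names are illustrative only.
import pandas as pd

def check_completeness(df: pd.DataFrame, column: str, threshold: float = 0.99) -> bool:
    """Pass if the fraction of non-null values meets the threshold."""
    non_null_ratio = df[column].notna().mean()
    return bool(non_null_ratio >= threshold)

def check_uniqueness(df: pd.DataFrame, column: str) -> bool:
    """Pass if every value in the column is unique (e.g., a primary key)."""
    return bool(df[column].is_unique)

# Example run against a toy "claims" dataset
claims = pd.DataFrame({
    "claim_id": [1, 2, 3, 4],
    "member_id": [10, 11, None, 13],
})
print(check_uniqueness(claims, "claim_id"))     # True
print(check_completeness(claims, "member_id"))  # False (75% complete < 99%)
```

In a real framework, checks like these would run per table inside the pipeline and feed the metrics dashboards mentioned above.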

II. Data Platform Expertise & Implementation

• Leverage expertise in AWS Glue for building and implementing data transformation jobs and embedding data quality rules directly into ETL/ELT processes.

• Utilize AWS Glue DataBrew for profiling, cleaning, and normalizing datasets, ensuring data readiness before consumption.

• Design and execute performance-optimized data quality checks and validation queries directly on Amazon Redshift data warehouse tables.

• Work with other relevant AWS technologies (e.g., S3, Lambda, CloudWatch) to build scalable and resilient data quality solutions.

III. Data Analysis & Validation

• Write highly complex, efficient, and optimized SQL queries for in-depth data profiling, testing, validation, and root cause analysis of data quality issues.

• Perform deep-dive analysis on data sets to identify trends, patterns, and systemic data errors that impact business decision-making.

• Collaborate with Data Architects and Data Engineers to define and enforce organizational data governance and quality standards.

• Document data quality rules, validation logic, and data profiling results clearly and comprehensively.
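As a hedged sketch of the profiling and validation queries described above, a single SQL pass can compute row count, null rate, and duplicate keys. The example below runs against an in-memory SQLite database for portability; the same query shape applies on Redshift for simple checks. The "member" table and its columns are hypothetical:

```python
# Illustrative SQL data-profiling check run against in-memory SQLite.
# Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE member (member_id INTEGER, dob TEXT, state TEXT);
    INSERT INTO member VALUES
        (1, '1980-01-01', 'MN'),
        (2, NULL,         'MN'),
        (2, '1975-06-30', 'WI');  -- duplicate member_id
""")

# Profiling query: row count, null count on dob, and duplicate keys in one pass
profile_sql = """
    SELECT
        COUNT(*)                              AS row_count,
        SUM(CASE WHEN dob IS NULL THEN 1 END) AS null_dob,
        COUNT(*) - COUNT(DISTINCT member_id)  AS duplicate_keys
    FROM member
"""
row_count, null_dob, duplicate_keys = conn.execute(profile_sql).fetchone()
print(row_count, null_dob, duplicate_keys)  # 3 1 1
```

Each non-zero quality counter would then be surfaced as a failed check and documented alongside the validation logic, per the responsibilities above.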

Required Technical Skills and Experience

• 9+ years of hands-on experience in a Data Quality, Data Engineering, or Data Testing role.

• Expert-level proficiency in SQL with proven experience writing complex queries for data analysis, validation, and manipulation.

• Strong programming skills in Python (or a similar language) focused on building testing frameworks, automation scripts, and data analysis utilities.

• Deep working knowledge of AWS data services, including:

   o AWS Glue (Job Development, Data Catalog)

   o Amazon Redshift (Querying, Performance Tuning)

   o AWS Glue DataBrew (Profiling, Cleaning)

   o MWAA / Airflow (Automation & Orchestration)

• Proven ability to define, measure, and report on critical Data Quality Metrics (e.g., completeness, consistency, accuracy, uniqueness, validity, timeliness).

• Experience with data modeling concepts (e.g., star schema, snowflake) and their implications for data quality.

• Healthcare experience is preferred.
