Data Quality Technical Lead

Remote • Posted 15 hours ago • Updated 15 hours ago
Contract (W2 or Independent) • Compensation: Depends on Experience

Job Details

Skills

  • Data Quality
  • MDM
  • ETL
  • JavaScript
  • REST APIs
  • SQL
  • MDH
  • Data lake
  • Loqate
  • Address Doctor

Summary

Cycle3 IT Staffing is seeking a Data Quality Technical Lead for a consulting role.

Location: Lakeville, MN (onsite Tuesday through Thursday); open to remote
Duration: 6 Months to start

Team: Two Governance Analysts and a team of Item Master Data Stewards. There is also a data engineering team working with Snowflake. The new hires will fill critical gaps in MDM architecture and technical data quality expertise.

Context: The company is navigating a complex data integration effort following recent M&A activity, operating with two ERPs in a low data-maturity environment (Level 1-2). A prior Stibo STEP MDM implementation was paused, though the organization retains a long-term investment in the platform.

This role is focused on supporting M&A data conversion by leveraging existing MDM tools (Stibo STEP and Boomi MDH) to improve data quality, establish integrity controls between ERPs (especially around item master data), and eliminate duplicates. The objective is to deliver a practical MVP, not a full-scale MDM rollout.

The Data Quality Technical Lead will report to the Associate Director of Data Strategy & Integration and partner with an MDM Architect, Governance Analysts, and Item Master Data Stewards. While a Snowflake team exists, there is no centralized data lake. The ideal candidate brings a governance-first mindset, strong data quality expertise, and comfort operating in a low-maturity environment.

Must Have:
Experience designing end-to-end DQ processes for MDM and Data Warehouse projects
Strong data quality experience across core ETL concepts and tools
Solid understanding of DQ dimensions and the ability to define KPIs across those dimensions
Hands-on experience in data profiling, deriving DQ rules, and building scorecards and dashboards
Strong understanding of REST APIs and JavaScript for system integration
Ability to write medium-to-complex SQL queries on large volumes of data
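
The profiling and SQL requirements above go together in practice: DQ dimension KPIs are often computed directly in SQL. Below is a minimal sketch using Python's built-in sqlite3 to score completeness and uniqueness on a sample item table; the table and column names are illustrative assumptions, not the client's actual schema.

```python
# Minimal sketch of SQL-based data profiling for two DQ dimensions
# (completeness and uniqueness). The item_master table and its columns
# are made-up examples, not the client's schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item_master (item_id TEXT, description TEXT, uom TEXT);
    INSERT INTO item_master VALUES
        ('A100', 'Widget', 'EA'),
        ('A100', 'Widget', 'EA'),  -- duplicate item_id
        ('A200', NULL,     'CS'),  -- missing description
        ('A300', 'Gadget', 'EA');
""")

# Completeness: share of rows with a non-null description.
# Uniqueness: share of distinct item_ids among all rows.
completeness, uniqueness = conn.execute("""
    SELECT
        1.0 * COUNT(description)         / COUNT(*) AS completeness,
        1.0 * COUNT(DISTINCT item_id)    / COUNT(*) AS uniqueness
    FROM item_master
""").fetchone()

print(f"completeness={completeness:.2f} uniqueness={uniqueness:.2f}")
# -> completeness=0.75 uniqueness=0.75
```

The same ratio pattern extends to the other DQ dimensions (validity, conformity) by swapping the numerator predicate.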

Should Have:
Boomi MDH / ETL experience
Experience operating without a centralized data lake
Familiarity with address validation services such as Loqate or Address Doctor
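
For orientation, the kind of rule-based standardization a DQ pipeline applies before handing records to a validation service can be sketched as below. This is not the Loqate or Address Doctor API; the abbreviation map and function are made-up illustrations.

```python
# Toy address standardization pass: uppercase, collapse whitespace,
# and apply USPS-style street abbreviations. The abbreviation map is
# an illustrative assumption, not a vendor ruleset.
STREET_ABBREVIATIONS = {
    "STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "BOULEVARD": "BLVD",
}

def standardize_address(line: str) -> str:
    """Normalize one address line token by token."""
    tokens = line.upper().split()
    return " ".join(STREET_ABBREVIATIONS.get(t, t) for t in tokens)

print(standardize_address("123  Main Street"))  # -> 123 MAIN ST
```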

Overview
Leads the end-to-end planning, design, and development of the Data Quality process for the 8th Ave to PCB integration across the Customer and Finished Goods domains. This role is responsible for defining and implementing data profiling, cleansing, standardization, and ETL processes to convert 8th Ave data to PCB standards. The Data Quality Lead will partner with business teams, process excellence, project managers, IT, and data teams to build a scalable, automated, and reusable data quality process to support current and future integrations. The role also includes developing data quality dashboards for conversion reporting and ongoing data quality monitoring.
Key Responsibilities
Profile acquired 8th Avenue Customer and Item datasets to assess completeness, validity, uniqueness, and conformity.
Communicate data quality issues to stakeholders and drive resolution through clear, actionable feedback.
Design and implement standardized data quality processes, including profiling, cleansing, transformation, and ETL/ELT workflows for Customer and Item integration, built to be reusable for future integrations and aligned with existing Item Master data quality reporting tools where possible.
Establish standardized Data Quality scorecards and alerts to support ongoing integration monitoring, leveraging existing tools when applicable.
Collaborate with the MDM Architect to align data quality rules and processes with the MDM design, and support MDM governance to ensure consistent, high-quality enterprise master data.
Review and leverage existing MDM design and development documentation where possible.
Review the current Data Quality design and implementation, including both MDM and the Data.Gaps tool, where possible.
Provide on-the-job training and knowledge transfer sessions, including documentation and task delegation, for the company's internal team.
Contribute to the enterprise data catalog, including updates to the business glossary, metadata, and lineage documentation.
Identify capability gaps, support tool evaluations, and contribute to the future-state Data Quality roadmap.
Collaborate cross-functionally to elicit and review business requirements related to master data quality and ensure the delivery of high-quality master data.
Perform unit testing, and support user acceptance testing (UAT), deployment, post-production validation, and defect resolution in alignment with defined SLAs.
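
Several responsibilities above (standardized rules, reusable processes, scorecards) converge on one pattern: a rules repository applied uniformly to records, with pass rates rolled up per DQ dimension. A minimal sketch of that pattern follows; the rule names and record fields are illustrative assumptions only.

```python
# Sketch of a reusable data quality rules repository: each rule is a
# named predicate tagged with a DQ dimension, and the scorecard is the
# pass rate per dimension. Field names ("item_id", "uom", etc.) are
# made-up examples.
from collections import defaultdict

RULES = [
    ("item_id_present",     "completeness", lambda r: bool(r.get("item_id"))),
    ("description_present", "completeness", lambda r: bool(r.get("description"))),
    ("uom_valid",           "validity",     lambda r: r.get("uom") in {"EA", "CS"}),
]

def scorecard(records):
    """Apply every rule to every record; return pass rate per dimension."""
    passed, total = defaultdict(int), defaultdict(int)
    for record in records:
        for _name, dimension, predicate in RULES:
            total[dimension] += 1
            if predicate(record):
                passed[dimension] += 1
    return {d: passed[d] / total[d] for d in total}

records = [
    {"item_id": "A100", "description": "Widget", "uom": "EA"},
    {"item_id": "A200", "description": None,     "uom": "XX"},
]
print(scorecard(records))  # -> {'completeness': 0.75, 'validity': 0.5}
```

Keeping rules as data (rather than hard-coded checks) is what makes the repository reusable across the current conversion and future integrations.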

Deliverables
Data Quality Capability Assessment Report: Comprehensive evaluation of current state data quality processes, tools, governance, and organizational readiness.
Data Profiling & Baseline Metrics Package: Includes profiling reports, baseline data quality scores, and a consolidated issue log with severity, impact, and remediation recommendations.
Data Quality Design Document: Detailed design covering architecture, controls, validation rules, monitoring workflows, and integration points.
Documented Data Quality Rules Library & Cleansing Specifications: A complete repository of DQ rules, cleansing logic, validation criteria, and an Integration Data Quality Playbook outlining repeatable processes, standards, and escalation paths.
Standardized Customer and Item Master Data Ruleset: Includes address standardization and other normalization rules necessary for harmonized master data across systems.
Data Quality Dashboards & Monitoring Framework: Dashboards with defined alert thresholds, KPIs, and weekly trend-based scorecards for both initial validation and ongoing monitoring.
Cleansed & Standardized Customer and Item Master Data: Fully transformed, quality validated datasets loaded into MDM and integrated with target systems for deployment.
Tools Evaluation & Future State Capability Roadmap: Assessment of enabling technologies with recommendations, prioritization, and a roadmap for maturing Data Quality capabilities.
Training & Knowledge Transfer Package: Includes training plan, process documentation, walkthroughs, and transition sessions for the company's internal team.
Unit Test Cases & Results: Validated test cases demonstrating accuracy, completeness, and compliance of all Data Quality processes and integration workflows.
Deployment Plan & Execution Checklist: Documented deployment activities, readiness criteria, validation steps, and post-deployment controls.
Incident & Defect Reports: Logs capturing defects, root causes, resolution steps, and SLA adherence for both the testing and post-production phases.

Skills
Must Have:
Data Quality expertise for MDM implementations, including end-to-end DQ process design and execution
Proficient with Data Quality tools such as Informatica and SAP Data Quality
Skilled in building DQ dashboards and designing ongoing data quality monitoring and reporting
Strong ETL experience supporting data cleansing, standardization, and transformation
Hands-on proficiency in data profiling, cleansing, and standardization
Experience building reusable Data Quality rules repositories
Expertise in designing ongoing DQ reporting and monitoring frameworks
Integration experience with address validation tools (Loqate/other external address validation services)
Deep understanding of Data Quality dimensions and designing corresponding DQ metrics and reports
Technical skills including JavaScript, REST APIs, and Tableau for reporting and automation
Understanding of Data Governance and related frameworks

Good to Have:
Experience working within the Oracle JDE ecosystem
Boomi ETL experience
MDH (Master Data Hub) experience
Data.Gaps (Data Quality) experience
M&A integration experience

KPIs:
100% coverage of Data Quality scores across all applicable DQ dimensions for master data and associated attributes
100% automated data quality profiling, cleansing, validation, standardization, and ETL, enabling scalable support for current and future integrations as well as the evolving MDM data model
Post-go-live master data issues represent less than 1% of total integration defects
Defect resolution turnaround time consistently meets defined business SLAs

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10329512
  • Position Id: MN2Peter159