Senior Data Integration & Data Quality Analyst-McLean, VA - 100% Onsite-Need Locals Only

McLean, VA, US • Posted 1 hour ago • Updated 1 hour ago
Contract W2
On-site
Depends on Experience


Job Details

Skills

  • Advanced SQL
  • Snowflake
  • Db2
  • Data Integration
  • Data Quality
  • Data Profiling
  • ETL Analysis
  • ETL Reverse Engineering
  • Data Warehousing (EDW/ODS)
  • MongoDB
  • Source-to-Target Mapping
  • Data Lineage
  • Dimensional Modeling
  • Python (Pandas, SQLAlchemy)
  • Snowflake Cortex
  • DataStage
  • Informatica IICS
  • Talend
  • SSIS
  • dbt
  • Data Validation
  • Data Reconciliation
  • SCD Type 1/2
  • Collibra
  • AWS (S3, Glue, Lambda)
  • Data Governance
  • CDC
  • Mortgage Domain
  • Financial Services

Summary

Job Title: Senior Data Integration & Data Quality Analyst

McLean, VA - 100% Onsite

Duration: Long Term

Senior Data Integration Analyst: SQL, Snowflake, Db2, Data Quality, ETL, MongoDB

Description:

We are seeking a Senior Data Integration & Data Quality Analyst with deep expertise in advanced SQL, data profiling, and end-to-end data integration/ETL analysis. The ideal candidate will reverse engineer legacy pipelines and ensure strong data quality across enterprise data platforms including Snowflake, Db2, and MongoDB. Experience with Snowflake Cortex is desired to support AI-assisted analytics, automation, and data quality use cases. Python proficiency is strongly preferred to accelerate automation and validation. Experience in the mortgage or financial domain is highly desirable.

Qualifications:

Required:

  • 5+ years of experience in data integration, data quality, data warehousing, or related roles.
  • Expert-level SQL with strong experience in Snowflake and Db2 within EDW/ODS environments.
  • Experience integrating and analyzing data from MongoDB (document structures, nested fields, schema drift considerations).
  • Proven experience with data profiling and data quality analysis.
  • Demonstrated ability to reverse engineer ETL/data pipelines and document transformation logic from existing jobs and code.
  • Strong experience producing source-to-target mappings and data lineage documentation.
  • Experience with data modeling concepts in enterprise data warehouses (e.g., dimensional modeling, defining grain, keys, relationships, and conformed dimensions).
  • Excellent analytical skills and strong technical documentation/writing ability; comfortable working with incomplete or legacy documentation.
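To illustrate the kind of profiling and data quality analysis the role calls for, here is a minimal Pandas sketch. The sample data, column names, and thresholds are all hypothetical; real work would run against Snowflake, Db2, or MongoDB extracts:

```python
import pandas as pd

# Hypothetical sample extract standing in for a source table
df = pd.DataFrame({
    "loan_id": ["L001", "L002", "L002", "L004", None],
    "amount": [250000, 310000, 310000, None, 180000],
    "zip_code": ["22102", "2210", "22102", "20166", "22102"],
})

# Completeness: fraction of non-null values per column
completeness = df.notna().mean()

# Uniqueness: does the assumed business key (loan_id) repeat?
duplicate_keys = int(df["loan_id"].dropna().duplicated().sum())

# Format conformance: 5-digit ZIP pattern
zip_ok = df["zip_code"].str.fullmatch(r"\d{5}").fillna(False)

print(completeness.round(2).to_dict())   # per-column completeness ratios
print("duplicate loan_ids:", duplicate_keys)
print("invalid zip rows:", int((~zip_ok).sum()))
```

The same checks typically get expressed as reusable SQL against the warehouse; the Pandas form is handy for ad hoc investigation and for automating checks across many tables.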

Preferred:

  • Snowflake Cortex experience (or comparable AI/LLM capabilities within a data platform) applied to analytics, automation, or documentation workflows.
  • Python proficiency (e.g., Pandas, SQLAlchemy, plus cloud/utility libraries as needed).
  • ETL / Data Integration Tools: IBM DataStage, Informatica IICS, Talend, Nexus EBM, SSIS, dbt, or similar.
  • Cloud familiarity: AWS (Lambda, S3, Glue) or Azure/Google Cloud Platform equivalents.
  • Governance/metadata tools: Collibra; diagramming tools such as draw.io, Lucidchart, or Erwin.
  • Mortgage domain experience.

Responsibilities:

  • Advanced SQL: Develop and optimize complex SQL for profiling, validation, reconciliation, anomaly investigation, and root-cause analysis across Snowflake, Db2, and MongoDB; build reusable query assets and repeatable validation patterns.
  • Data Profiling & Data Quality: Perform profiling for completeness, uniqueness, format conformance, outliers, and referential integrity; document data quality issues and recommend remediation strategies.
  • ETL / Data Integration Reverse Engineering: Analyze and reverse engineer existing ETL/data integration pipelines (ETL tools, stored procedures, scripts) to reconstruct transformation logic, dependencies, and embedded business rules, especially where documentation is missing.
  • Mapping & Lineage Documentation: Produce detailed source-to-target mappings including column-level lineage, transformation logic, business rules, and handling for incremental loads and SCD Type 1/2 where applicable.
  • Data Modeling: Partner with data engineering and analytics teams to design and refine data models for ODS/EDW and downstream consumption, including dimensional and normalized models.
  • Python Automation: Build scripts and utilities to automate profiling, reconciliation, ETL validation/testing, lineage extraction, file parsing, audit trail generation, and incremental load checks.
  • Snowflake Cortex: Apply Cortex capabilities to accelerate data understanding and quality workflows in alignment with governance and security standards.
  • Governance & Standards: Support enterprise standards (naming, typing, null handling, audit columns), contribute profiling/lineage artifacts to governance processes, and assist with traceability patterns and CDC/audit logging approaches.
  • Collaboration: Partner with data engineering, BI/reporting, governance, and business stakeholders to validate logic, confirm requirements, and support modernization/migration initiatives.
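As a sketch of the SCD Type 1/2 validation and reconciliation work listed above, the following Python snippet checks two common Type 2 integrity rules: exactly one current row per business key, and no overlapping effective-date ranges. The table layout and column names are hypothetical:

```python
from datetime import date
import pandas as pd

# Hypothetical SCD Type 2 dimension rows; key 2 has overlapping ranges
dim = pd.DataFrame({
    "customer_key": [1, 1, 2, 2],
    "effective_from": [date(2023, 1, 1), date(2024, 3, 1),
                       date(2023, 6, 1), date(2024, 1, 1)],
    "effective_to":   [date(2024, 3, 1), date(9999, 12, 31),
                       date(2024, 2, 1), date(9999, 12, 31)],
    "is_current": [False, True, False, True],
})

def scd2_violations(df):
    """Return counts of two common SCD Type 2 integrity violations."""
    # Rule 1: each key should have exactly one current row
    current_counts = df[df["is_current"]].groupby("customer_key").size()
    bad_current = int((current_counts != 1).sum())

    # Rule 2: within a key, a row's effective_to must not exceed the
    # next row's effective_from (date ranges must not overlap)
    overlaps = 0
    ordered = df.sort_values(["customer_key", "effective_from"])
    for _, grp in ordered.groupby("customer_key"):
        froms = grp["effective_from"].tolist()
        tos = grp["effective_to"].tolist()
        overlaps += sum(1 for i in range(len(grp) - 1) if tos[i] > froms[i + 1])
    return bad_current, overlaps

print(scd2_violations(dim))  # (bad_current_count, overlap_count)
```

In practice these checks would run as SQL or automated jobs against the warehouse dimension tables, with violations logged for remediation rather than printed.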

Regards,

Sai Srikar

Email:

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91081414
  • Position Id: 8924219