QA/QC Data Validation Consultant

Overview

Location: Remote
Compensation: Depends on Experience
Employment Type: Contract - W2 or Contract - Independent
Contract Duration: 12 Month(s)

Skills

Data Validation
QA
QC
Alteryx
Informatica IDMC
Snowflake
Databricks
ETL

Job Details

Job Title: QA/QC Data Validation Consultant

Job Location: 100% Remote

Job Duration: 12-month contract with a possibility of extension

Job Description:

  • Data Validation: Analyze and validate data for accuracy, completeness, and consistency (see the brief sketch after this list).
  • Quality Assurance: Develop and implement data quality standards and processes.
  • Issue Resolution: Identify and resolve data quality issues, ensuring compliance with regulations.
  • Collaboration: Work with IT and data management teams to design and implement data strategies.
  • Reporting: Generate data reports and presentations for management.
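
For illustration only, the minimal Python sketch below shows the kind of accuracy, completeness, and consistency checks these responsibilities involve; it uses pandas, and the orders.csv extract, column names, and value ranges are hypothetical placeholders rather than details from this posting.

# Illustrative data-validation checks (hypothetical file and column names).
import pandas as pd

df = pd.read_csv("orders.csv")  # hypothetical extract under validation

checks = {
    # Completeness: every record has a customer identifier.
    "customer_id_complete": df["customer_id"].notna().all(),
    # Accuracy: order amounts fall within a plausible range.
    "amount_in_range": df["order_amount"].between(0, 1_000_000).all(),
    # Consistency: order identifiers are unique.
    "order_id_unique": not df["order_id"].duplicated().any(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"Data validation failed: {failed}")
print("All validation checks passed.")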

Proficiency in data analysis tools, statistical software, and databases is required, including the following:

  • Alteryx:
    • Proficiency in building workflows for data preparation, blending, and analysis.
    • Familiarity with Alteryx Designer for creating repeatable workflows.
    • Knowledge of Alteryx Server for deploying and managing workflows.
    • Experience with Alteryx Connect for metadata management.

  • Informatica IDMC (Intelligent Data Management Cloud):
    • Expertise in creating and managing data integration mappings.
    • Familiarity with Pushdown Optimization (PDO) for ELT processes.
    • Knowledge of hierarchical schema configuration and JSON data parsing.
    • Experience with cloud data integration and managing connections (e.g., AWS S3, Snowflake).

  • Snowflake:
    • Understanding of Snowflake's architecture, including virtual warehouses and data sharing.
    • Proficiency in SQL for querying and managing data in Snowflake.
    • Experience with Snowflake's data integration capabilities, such as loading data from external sources.
    • Familiarity with Snowflake's security features, including role-based access control.

  • Databricks:
    • Knowledge of Delta Lake for managing structured and unstructured data.
    • Proficiency in Apache Spark for big data processing.
    • Experience with Databricks notebooks for data engineering and machine learning tasks.
    • Familiarity with Databricks' integration with cloud platforms like Azure and AWS.

  • ETL (Extract, Transform, Load):
    • Expertise in designing and implementing ETL pipelines for data migration and transformation.
    • Familiarity with data cleansing and validation techniques.
    • Knowledge of scheduling and monitoring ETL workflows.
    • Understanding of ELT processes, where transformations occur within the target system.
    • Experience with tools that support ELT, such as Snowflake and Databricks.
    • Proficiency in optimizing query performance for large-scale data transformations.

These skills will help ensure alignment with the technical requirements of these tools and platforms; a brief post-load validation sketch follows.
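
To make the ETL/ELT validation requirements above concrete, here is a minimal, non-authoritative post-load check in Python. The snowflake-connector-python usage is standard, but the warehouse name, the STAGING.PUBLIC.ORDERS_RAW and ANALYTICS.PUBLIC.ORDERS tables, and the ORDER_ID key column are assumptions for illustration, not details from this posting.

# Post-load ETL validation sketch against Snowflake.
# All connection values, table names, and column names are hypothetical.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="VALIDATION_WH",  # hypothetical virtual warehouse
)
cur = conn.cursor()

def scalar(sql):
    # Run a query and return the single value in its first row.
    cur.execute(sql)
    return cur.fetchone()[0]

try:
    source_rows = scalar("SELECT COUNT(*) FROM STAGING.PUBLIC.ORDERS_RAW")
    target_rows = scalar("SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS")
    null_keys = scalar(
        "SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS WHERE ORDER_ID IS NULL"
    )
finally:
    cur.close()
    conn.close()

# Completeness: every staged row reached the target table.
assert source_rows == target_rows, f"Row count mismatch: {source_rows} vs {target_rows}"
# Validity: the business key is always populated after transformation.
assert null_keys == 0, f"{null_keys} rows have a NULL ORDER_ID"
print("ETL validation passed: row counts match and keys are populated.")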

About American IT Systems