Job Details
Title: QA/QC Data Validation Consultant
Location: Remote
Contract: 6+ months
Must be willing to use your own laptop
Must be willing to work PST hours
Responsibilities
- Data Validation: Analyze and validate data for accuracy, completeness, and consistency.
- Quality Assurance: Develop and implement data quality standards and processes.
- Issue Resolution: Identify and resolve data quality issues, ensuring compliance with regulations.
- Collaboration: Work with IT and data management teams to design and implement data strategies.
- Reporting: Generate data reports and presentations for management.
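The accuracy, completeness, and consistency checks named above can be sketched as a small validation routine. This is a minimal illustration, not part of the posting; the field names (`id`, `amount`, `start`, `end`) and rules are hypothetical stand-ins for whatever the actual data quality standards define.

```python
# Hypothetical sketch of the three validation checks named in the
# responsibilities: completeness (required fields present), accuracy
# (values in an expected range), and consistency (cross-field rules).
# Field names and rules are illustrative assumptions only.

def validate(rows, required_fields):
    """Return a list of (row_index, issue) tuples for records that fail."""
    issues = []
    for i, row in enumerate(rows):
        # Completeness: every required field present and non-empty.
        for field in required_fields:
            if not row.get(field):
                issues.append((i, f"missing {field}"))
        # Accuracy: amounts must be non-negative numbers.
        try:
            if float(row.get("amount", 0)) < 0:
                issues.append((i, "negative amount"))
        except ValueError:
            issues.append((i, "non-numeric amount"))
        # Consistency: end date may not precede start date.
        if row.get("start") and row.get("end") and row["end"] < row["start"]:
            issues.append((i, "end before start"))
    return issues

records = [
    {"id": "1", "amount": "100.0", "start": "2024-01-01", "end": "2024-02-01"},
    {"id": "2", "amount": "-5", "start": "2024-03-01", "end": "2024-01-01"},
    {"id": "", "amount": "abc", "start": "", "end": ""},
]
problems = validate(records, required_fields=["id", "amount"])
```

In practice these rules would be driven by the data quality standards the role is expected to develop, and the findings fed into the reporting workflow.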
| Skills | No. of Years of Experience | Detailed Write-up |
| --- | --- | --- |
| Total no. of years of experience | | |
| No. of years of experience as a QA/QC Data Validation Consultant | | |
| Proficiency in data analysis tools, statistical software, and databases | | |
| Alteryx: proficiency in building workflows for data preparation, blending, and analysis; familiarity with Alteryx Designer for creating repeatable workflows; knowledge of Alteryx Server for deploying and managing workflows; experience with Alteryx Connect for metadata management | | |
| Informatica IDMC (Intelligent Data Management Cloud): expertise in creating and managing data integration mappings; familiarity with Pushdown Optimization (PDO) for ELT processes; knowledge of hierarchical schema configuration and JSON data parsing; experience with cloud data integration and managing connections (e.g., AWS S3, Snowflake) | | |
| Snowflake: understanding of Snowflake's architecture, including virtual warehouses and data sharing; proficiency in SQL for querying and managing data in Snowflake; experience with Snowflake's data integration capabilities, such as loading data from external sources; familiarity with Snowflake's security features, including role-based access control | | |
| Databricks: knowledge of Delta Lake for managing structured and unstructured data; proficiency in Apache Spark for big data processing; experience with Databricks notebooks for data engineering and machine learning tasks; familiarity with Databricks' integration with cloud platforms like Azure and AWS | | |
| ETL (Extract, Transform, Load): expertise in designing and implementing ETL pipelines for data migration and transformation; familiarity with data cleansing and validation techniques; knowledge of scheduling and monitoring ETL workflows; understanding of ELT processes, where transformations occur within the target system; experience with tools that support ELT, such as Snowflake and Databricks; proficiency in optimizing query performance for large-scale data transformations | | |

These skills will help ensure alignment with the technical requirements of these tools and platforms.
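A recurring QA step behind the ETL skills listed above is post-load reconciliation: confirming that what landed in the target (e.g., Snowflake or Databricks) matches the source. The sketch below uses in-memory stand-ins for the two sides; in a real pipeline each list would come from a query against the respective system, and the key column is an assumption.

```python
# Hedged sketch of a post-load ETL reconciliation check: compare row
# counts and key sets between source and target. The in-memory rows and
# the "id" key column are illustrative stand-ins, not a real schema.

from collections import Counter

def reconcile(source_rows, target_rows, key):
    """Compare row counts and key multisets between source and target."""
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
    }
    src_keys = Counter(r[key] for r in source_rows)
    tgt_keys = Counter(r[key] for r in target_rows)
    # Counter subtraction drops keys whose counts match, leaving only gaps.
    report["missing_in_target"] = sorted((src_keys - tgt_keys).keys())
    report["unexpected_in_target"] = sorted((tgt_keys - src_keys).keys())
    report["match"] = (
        not report["missing_in_target"]
        and not report["unexpected_in_target"]
        and report["source_count"] == report["target_count"]
    )
    return report

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
result = reconcile(source, target, key="id")
```

The same count-and-keys comparison translates directly into SQL (e.g., an anti-join between staging and target tables) when both sides live in the warehouse.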