Job Details:
Job Title: ETL and Data Warehouse Test Engineer
Location: Open for remote
Duration: 12-month contract (possibility of conversion to full-time)
Note: This position is on W2 only and is not eligible for visa sponsorship; candidates on dependent visas cannot be considered.
Overview:
The ETL and Data Warehouse Test Engineer is responsible for designing, developing, and executing test plans and test cases to ensure the quality and accuracy of data flows and data transformations in our data warehouse environment. This role focuses on validating ETL processes, data integrity, and data warehouse components using both automated and manual testing techniques. The ideal candidate has hands-on experience with Azure platforms, the Snowflake platform, and Talend, and a strong understanding of regression testing to support continuous integration and deployment within a cloud-based environment.
Core Responsibilities:
Test Planning and Strategy: Develop and execute comprehensive test plans, test cases, and test scripts for ETL processes, data warehouse structures, and data integrations.
ETL and Data Warehouse Testing: Conduct end-to-end testing of ETL processes to ensure data accuracy, completeness, and correct transformations in line with business requirements (a sample reconciliation and data-quality check is sketched after this list).
Regression Testing: Design and perform regression testing to validate existing functionality and data pipelines, ensuring the stability of the data environment across updates.
Platform Expertise: Use Azure Data Services (Azure Data Factory, Azure Synapse Analytics, etc.) and the Snowflake platform to validate data pipelines, transformations, and warehousing solutions.
Data Management Lifecycle: Perform tasks spanning the full lifecycle of data management activities, including but not limited to defining completeness, accuracy, and consistency specifications by data element; writing scripts and developing tools to monitor quality; and defining and implementing controls for key data quality measures.
Automation and Scripting: Create and maintain automated test scripts and frameworks for ETL processes and data validation, leveraging tools like Talend for ETL job automation.
Data Quality and Integrity Checks: Perform data quality, integrity, and accuracy checks, identifying and troubleshooting data discrepancies and implementing corrective actions.
Collaboration: Work closely with data engineers, data architects, and business analysts to understand requirements, data flows, and business logic, ensuring thorough testing coverage.
Reporting and Documentation: Document and report defects, issues, and enhancements; track progress on defect resolution and update stakeholders on testing status.
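For illustration only, a minimal sketch of the reconciliation and data-quality queries this role would write and automate. The table and column names (stg_orders, dw_orders, order_id) are hypothetical placeholders, not part of any actual schema; real checks would target the Snowflake warehouse objects loaded by the Talend jobs under test.

  -- Row-count reconciliation between a hypothetical staging table and its warehouse target
  SELECT
      (SELECT COUNT(*) FROM stg_orders) AS source_row_count,
      (SELECT COUNT(*) FROM dw_orders)  AS target_row_count,
      (SELECT COUNT(*) FROM stg_orders) - (SELECT COUNT(*) FROM dw_orders) AS row_count_diff;

  -- Completeness check: the business key should never be null in the target
  SELECT COUNT(*) AS null_order_ids
  FROM dw_orders
  WHERE order_id IS NULL;

  -- Uniqueness check: the business key should not be duplicated after the ETL load
  SELECT order_id, COUNT(*) AS occurrences
  FROM dw_orders
  GROUP BY order_id
  HAVING COUNT(*) > 1;

Queries like these are typically parameterized and wired into an automated regression suite so they run against every new load.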
Skills and Qualifications:
Strong understanding of, and experience with, development activities across all aspects of the Software Development Life Cycle (SDLC)
Knowledge of Data Vault methodologies
Excellent problem-solving and critical-thinking skills
Effective communication skills with an ability to explain technical concepts to developers, product managers, and business partners.
Knowledge and experience with database design principles, including referential integrity, normalization, and indexing, to support application development (see the sample integrity query after this list)
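As a concrete illustration of the referential-integrity checks referenced above, the query below looks for orphaned foreign keys; the table and column names (dw_orders, dw_customers, customer_id) are hypothetical placeholders.

  -- Referential integrity check: orders whose customer_id has no matching customer row
  SELECT o.order_id, o.customer_id
  FROM dw_orders AS o
  LEFT JOIN dw_customers AS c
      ON o.customer_id = c.customer_id
  WHERE c.customer_id IS NULL;

A non-empty result indicates orphaned records that the ETL process should have rejected or remapped.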
Education:
Bachelor's or master's degree in a quantitative field such as Computer Science, Information Systems, Database Management, Big Data, Data Engineering, Data Science, Applied Math, etc.
5+ years of professional data test engineering experience.
Preferred:
3+ years of experience working with large data sets and a variety of data sources.
Experience in Data Vault 2.0 methodology
Experience working in virtualized cloud environments including cloud-based IaaS/SaaS/PaaS solutions
Experience with Power BI and SQL