Job Details
We are seeking a skilled ETL / Big Data Tester experienced in testing data pipelines, ETL workflows, and analytical data models across large-scale distributed systems. The role involves hands-on work with Databricks, PySpark, and SQL to ensure data accuracy, integrity, and performance. Experience in data lake / Delta Lake testing, automation, and cloud platforms (AWS, Azure, or Google Cloud Platform) is essential.
Required Qualifications:
8+ years in data testing / QA for enterprise data systems.
5+ years in ETL / Big Data pipeline testing.
4+ years in Databricks and PySpark (DataFrame APIs, UDFs, transformations).
Strong SQL skills for data validation (joins, aggregations, window functions).
3+ years in Delta Lake / data lake testing.
3+ years in Python scripting for test automation.
Experience with cloud data platforms (Azure, AWS, or Google Cloud Platform).
Solid knowledge of data warehousing, dimensional modeling (star and snowflake schemas), and QA frameworks.
Experience working in Agile / SAFe environments.
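The SQL validation skills listed above (joins, aggregations, window functions) typically show up in reconciliation checks between a source table and its ETL target. The sketch below illustrates three common checks using Python's built-in sqlite3 as a stand-in for a Databricks SQL endpoint; the table and column names (`src_orders`, `tgt_orders`, `order_id`, `amount`) are illustrative assumptions, not part of this role's actual schema.

```python
import sqlite3

# Illustrative ETL reconciliation sketch. On Databricks the same SQL
# would typically run via spark.sql(); sqlite3 keeps this self-contained.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);
""")

# 1. Row-count check (aggregation): did the pipeline load every row?
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# 2. Row-level diff (join): rows missing from the target, or present
#    with a drifted value.
mismatches = conn.execute("""
    SELECT s.order_id, s.amount AS src_amount, t.amount AS tgt_amount
    FROM src_orders s
    LEFT JOIN tgt_orders t ON s.order_id = t.order_id
    WHERE t.order_id IS NULL OR s.amount <> t.amount
    ORDER BY s.order_id
""").fetchall()

# 3. Duplicate-key check (window function): no key should load twice.
dupes = conn.execute("""
    SELECT order_id FROM (
        SELECT order_id,
               COUNT(*) OVER (PARTITION BY order_id) AS n
        FROM tgt_orders
    ) WHERE n > 1
""").fetchall()

print(src_count, tgt_count)  # 3 2
print(mismatches)            # [(2, 20.0, 25.0), (3, 30.0, None)]
print(dupes)                 # []
```

In practice such checks are parameterized over many tables and wired into a test-automation framework, which is where the Python scripting requirement above comes in.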