SDET - PySpark & ETL Automation Engineer (ICEDQ)

Overview

Remote
$40 - $60
Contract - Independent
Contract - W2
Contract - 1 Year(s)

Skills

SDET
Software Development Engineer in Test
AWS
PySpark
QuerySurge
ICEDQ
automation
ETL testing
Hadoop
Big Data
API testing
Jenkins
CI/CD automation
SQL queries
QA Engineer
data warehousing concepts
Informatica
Glue
Redshift
EMR
S3

Job Details

Job Title: SDET - PySpark & ETL Automation Engineer (ICEDQ)
Location: Remote
Contract Type: C2C
Job Overview:
We are looking for a highly skilled SDET (Software Development Engineer in Test) with strong experience in ETL testing, automation using ICEDQ or QuerySurge, PySpark, and AWS. The ideal candidate will have a solid understanding of data pipelines, software testing, and cloud-based ETL processes. This is a remote contract position requiring hands-on technical expertise and problem-solving ability.
Key Responsibilities:

  • Design, develop, and maintain automated test cases for ETL pipelines using tools like QuerySurge or ICEDQ
  • Perform end-to-end ETL validation and testing in Informatica or similar platforms
  • Write complex SQL queries to validate data integrity, transformations, and business logic
  • Develop and execute tests using PySpark to validate large-scale data processing jobs (see the sketch after this list)
  • Conduct API testing for data services and backend systems
  • Use GitHub for version control and Jenkins for CI/CD automation
  • Work in Unix/Linux environments for data file manipulation and batch testing
  • Collaborate with data engineers and the QA team to define test strategies, test plans, and defect management processes
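
The kind of automated ETL check described above can be illustrated with a short PySpark sketch. This is a minimal example under stated assumptions: the table names (raw_orders, dw_orders) and the transformation rule are hypothetical placeholders, and the checks are written as plain assertions rather than ICEDQ or QuerySurge rules.

```python
# Minimal PySpark sketch of an ETL validation test.
# Table names (raw_orders, dw_orders) and the transformation rule
# are hypothetical placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-validation-sketch").getOrCreate()

source = spark.table("raw_orders")   # hypothetical staging table
target = spark.table("dw_orders")    # hypothetical warehouse table

# 1. Row-count reconciliation between source and target.
assert source.count() == target.count(), "Row counts do not match"

# 2. Transformation check: target.total_amount should equal
#    source.quantity * source.unit_price for every order_id.
expected = source.select(
    "order_id",
    (F.col("quantity") * F.col("unit_price")).alias("expected_total"),
)
mismatches = (
    target.join(expected, on="order_id", how="inner")
          .where(F.col("total_amount") != F.col("expected_total"))
)
assert mismatches.count() == 0, "Some rows failed the transformation rule"

spark.stop()
```

In practice, checks like these would typically be wrapped in a test framework such as pytest and triggered from Jenkins as part of the CI/CD pipeline.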

Required Skills:

  • 5+ years of experience as an SDET or QA Engineer
  • Proficient in ETL testing with tools like ICEDQ or QuerySurge
  • Strong knowledge of SQL and data warehousing concepts
  • Hands-on with PySpark for testing data pipelines
  • Experience with Informatica or equivalent ETL tools
  • Proficiency in Python, API testing, and test automation frameworks (see the API test sketch after this list)
  • Familiarity with GitHub, Jenkins, and Unix/Linux
  • Knowledge of AWS services related to data (e.g., S3, Glue, Redshift, EMR) is a strong plus
  • Good understanding of software design, testing techniques, and test design principles
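
For the API-testing side of the role, a minimal sketch using pytest conventions and the requests library is shown below. The endpoint URL and response fields are assumptions made for illustration, not details of any actual data service.

```python
# Sketch of an API test for a data service, using pytest-style assertions
# and the requests library. The URL and JSON fields are hypothetical.
import requests

BASE_URL = "https://example.com/api/v1"  # placeholder, not a real service

def test_orders_endpoint_returns_valid_payload():
    response = requests.get(f"{BASE_URL}/orders", params={"limit": 10}, timeout=10)

    # Basic contract checks: status code and content type.
    assert response.status_code == 200
    assert response.headers.get("Content-Type", "").startswith("application/json")

    # Schema sanity check on each record (assumed fields).
    for record in response.json():
        assert "order_id" in record
        assert record["total_amount"] >= 0
```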

Nice to Have:

  • Experience with big data platforms and distributed testing
  • Familiarity with Agile methodologies and BDD/TDD