Databricks Tester

  • REMOTE WORK, VA

Overview

Remote or On Site
USD 80,001.00 - 120,000.00 per year
Full Time

Skills

Business Rules
Data Mapping
Extraction
Test Scripts
Extract
Transform
Load
Reporting
Flat File
Regression Analysis
System Integration Testing
Dimensional Modeling
Negative Testing
Collaboration
Performance Testing
Test Execution
Dashboard
Agile
UPS
Sprint
Continuous Improvement
Mentorship
Security Clearance
Data Engineering
ETL QA
Data Validation
Writing
Test Cases
Data Manipulation
Databricks
Workflow
Orchestration
Microsoft Azure
BMC Control-M
SQL
Query Optimization
Stored Procedures
Scripting
Python
Pandas
PySpark
Unit Testing
Java
JavaScript
Automated Testing
Selenium
TestNG
SoapUI
API QA
Data Warehouse
Snowflake Schema
JSON
XML
Apache Parquet
Apache Avro
Computerized System Validation
Data Quality
Metadata Management
Testing
Defect Management
JIRA
Test Management
Zephyr
TestRail
Version Control
Git
Continuous Integration
Continuous Delivery
DevOps
Problem Solving
Conflict Resolution
Analytical Skill
Debugging
Information Technology
Systems Engineering
FOCUS

Job Details

Job ID: 2509669

Location: REMOTE WORK, VA, US

Date Posted: 2025-09-11

Category: Software

Subcategory: SW Engineer

Schedule: Full-time

Shift: Day Job

Travel: No

Minimum Clearance Required: Public Trust

Clearance Level Must Be Able to Obtain: None

Potential for Remote Work: Yes

Description

We are seeking a highly skilled Databricks Tester to join our team. The ideal candidate will have strong expertise in ETL, Databricks, workflow orchestration, SQL, Python, PySpark, and Java/JavaScript scripting. The role requires someone capable of analyzing complex data logic, extracting business rules from code, designing and executing test strategies, and automating testing processes to ensure high-quality data solutions.

Key Responsibilities:
  • Analyze business and technical requirements, ensuring complete test coverage across data mapping, data pipelines, transformations, and reporting layers.
  • Review code conversions, extract transformation logic, and validate complex source-to-target rules to determine test coverage.
  • Develop, execute, and maintain test cases, test scripts, test data, and automation frameworks for ETL or Databricks pipelines, Databricks notebooks, and workflow orchestration jobs.
  • Validate data ingestion, transformation, aggregation, cleansing, and reporting logic against business requirements, code conversions, and business rules.
  • Validate flat files (CSV, TSV, TXT, fixed-length), including delimiter handling, header validation, null-value handling, and schema verification.
  • Conduct data reconciliation between source, staging, and target systems to ensure accuracy and completeness.
  • Design and implement SQL- and Python-based automation frameworks for regression, smoke, and system integration testing.
  • Test data quality dimensions such as accuracy, completeness, consistency, timeliness, and validity.
  • Perform negative testing, boundary testing, and exception handling to ensure robustness of pipelines.
  • Collaborate with developers, data engineers, architects, and business analysts to identify data gaps, defects, and performance issues.
  • Conduct performance testing of queries and transformations to identify bottlenecks and recommend optimizations.
  • Provide clear and detailed defect reports, test execution results, and testing dashboards to stakeholders.
  • Support CI/CD integration of automated test scripts into deployment pipelines.
  • Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and contribute to continuous improvement of test processes.
  • Mentor junior team members in best practices for data testing and automation.
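To illustrate the flat-file checks named above (delimiter handling, header validation, null-value handling, schema verification), here is a minimal standard-library Python sketch; the column names and required-field rules are hypothetical, not part of this role's actual data:

```python
import csv
import io

# Hypothetical expected schema for an inbound delimited feed.
EXPECTED_HEADER = ["id", "amount", "updated_at"]
REQUIRED = ["id", "amount"]  # columns that must be non-null

def validate_flat_file(text, delimiter=","):
    """Return a list of human-readable findings for a delimited flat file."""
    findings = []
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    if not rows:
        return ["file is empty"]
    if rows[0] != EXPECTED_HEADER:  # header validation
        findings.append(f"unexpected header: {rows[0]}")
    for lineno, row in enumerate(rows[1:], start=2):
        if len(row) != len(EXPECTED_HEADER):  # schema / delimiter check
            findings.append(
                f"line {lineno}: expected {len(EXPECTED_HEADER)} fields, got {len(row)}")
            continue
        record = dict(zip(EXPECTED_HEADER, row))
        for col in REQUIRED:  # null-value handling
            if record[col].strip() in ("", "NULL"):
                findings.append(f"line {lineno}: required column '{col}' is null")
    return findings

sample = "id,amount,updated_at\n1,9.99,2025-09-11\n2,,2025-09-11\n3,5.00\n"
for finding in validate_flat_file(sample):
    print(finding)
```

In practice, checks like these would typically be wrapped in pytest cases and wired into the CI/CD pipeline alongside the other automated tests described above.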

Qualifications

Required Skills & Qualifications:
  • BS degree in a related field (4 years of experience in lieu of a degree).
  • Active IRS MBI Clearance.
  • 5+ years of experience in data engineering or data testing roles.
  • Proven experience with ETL testing and data validation in large-scale enterprise environments.
  • Strong experience creating test cases and writing SQL/Python scripts to validate data manipulation, report and file outputs, and source-to-target comparisons.
  • Hands-on experience with Databricks (notebooks, workflows, clusters, pipelines, jobs, reports, Delta Lake).
  • Expertise in workflow orchestration tools such as Airflow, Azure Data Factory, or Control-M.
  • Advanced proficiency in SQL (complex joins, CTEs, window functions, query optimization, stored procedures).
  • Strong scripting skills in Python (pandas, PySpark, unittest/pytest) and Java/JavaScript for test automation frameworks (Selenium, TestNG, SoapUI) for backend API testing.
  • Ability to interpret complex transformation logic and translate it into test validation rules.
  • Strong knowledge of data warehouse concepts, star/snowflake schemas, fact/dimension validation.
  • Experience testing structured and semi-structured data (JSON, XML, Parquet, Avro, CSV) and comparisons.
  • Experience with data quality frameworks and metadata-driven testing.
  • Experience with defect management tools (JIRA) and test management tools (qTest, Zephyr, TestRail).
  • Exposure to version control (Git, Bitbucket), CI/CD pipelines, and DevOps practices.
  • Strong problem-solving, analytical, and debugging skills.
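As a rough sketch of the SQL-based source-to-target reconciliation this role calls for, the following uses an in-memory SQLite database with hypothetical `src` and `tgt` tables standing in for real staging and target systems:

```python
import sqlite3

# Hypothetical source and target tables; in a real environment these would
# live in separate systems (e.g., a staging database and a Delta table).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);  -- row 3 not loaded
""")

def reconcile(conn):
    """Compare row counts and per-key values between source and target."""
    issues = []
    (src_n,) = conn.execute("SELECT COUNT(*) FROM src").fetchone()
    (tgt_n,) = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()
    if src_n != tgt_n:
        issues.append(f"row count mismatch: src={src_n} tgt={tgt_n}")
    # Keys present in source but missing or changed in target.
    missing = conn.execute("""
        SELECT s.id FROM src s
        LEFT JOIN tgt t ON t.id = s.id AND t.amount = s.amount
        WHERE t.id IS NULL
        ORDER BY s.id
    """).fetchall()
    issues += [f"id {i} missing or changed in target" for (i,) in missing]
    return issues

for issue in reconcile(conn):
    print(issue)
```

The same count-and-compare pattern scales to the Databricks context by running the queries through Spark SQL instead of SQLite.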

Target salary range: $80,001 - $120,000. The estimate displayed represents the typical salary range for this position based on experience and other factors.


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About SAIC