Overview
On Site
Depends on Experience
Full Time
Skills
Information Technology
Security Clearance
Business Rules
Data Mapping
Extraction
Test Scripts
Extract
Transform
Load
Flat File
Regression Analysis
System Integration Testing
Dimensional Modeling
Negative Testing
Collaboration
Performance Testing
Test Execution
Dashboard
Agile
UPS
Sprint
Continuous Improvement
Mentorship
Data Engineering
ETL QA
Data Validation
Writing
Test Cases
Data Manipulation
Databricks
Workflow
Orchestration
BMC Control-M
SQL
Query Optimization
Stored Procedures
Scripting
Python
Pandas
PySpark
Unit Testing
Java
JavaScript
Automated Testing
Selenium
TestNG
SoapUI
API QA
Data Warehouse
JSON
XML
Apache Parquet
Apache Avro
Computerized System Validation
Data Quality
Meta-data Management
Defect Management
JIRA
Test Management
Zephyr
TestRail
Version Control
Git
Continuous Integration
Continuous Delivery
DevOps
Conflict Resolution
Problem Solving
Analytical Skill
Debugging
Communication
Laptop
Cloud Computing
Microsoft Azure
Data Lake
Amazon Web Services
Amazon Redshift
Google Cloud Platform
Google Cloud
Snowflake Schema
Big Data
Apache Spark
Apache Kafka
Data Management
Data Masking
Business Intelligence
Reporting
Tableau
Data Visualization
Regulatory Compliance
HIPAA
Sarbanes-Oxley
Scalability
Testing
Project Coordination
Innovation
Corporate Social Responsibility
Recruiting
Job Details
About Our Company:
Delmock Technologies, Inc. (DTI), is a leading HUBZone business in Baltimore, known for delivering sophisticated IT (Information Technology) and Health solutions with a commitment to ethics, expertise, and superior service. Actively engaged in the local community, DTI creates opportunities for talented residents while maintaining a stellar reputation as an award-winning contractor, earning accolades like the Government Choice Award for IRS (Internal Revenue Service) Systems Modernizations.
Clearance:
- Active IRS MBI Clearance is required.
Location: This position is remote.
Role:
Delmock Technologies, Inc. is seeking a highly skilled Data Engineer Tester to join our team. The ideal candidate will have strong expertise in ETL, Databricks, workflow orchestration, SQL, Python, PySpark/Java Spark, and Java scripting. The role requires someone capable of analyzing complex data logic, extracting business rules from code, designing and executing test strategies, and automating testing processes to ensure high-quality data solutions.
Responsibilities:
- Analyze business and technical requirements, ensuring complete test coverage across data mapping, data pipelines, transformations, and reporting layers.
- Review code conversions, extract embedded logic, and validate complex source-to-target transformation rules to determine required test coverage.
- Develop, execute, and maintain test cases, test scripts, test data, and automation frameworks for ETL or Databricks pipelines, Databricks notebooks, and workflow orchestration jobs.
- Validate data ingestion, transformation, aggregation, cleansing, and reporting logic against business requirements, code conversions, and business logic/rules.
- Validate flat files (CSV, TSV, TXT, fixed-length), including delimiter handling, header validation, null-value handling, and schema verification.
- Conduct data reconciliation between source, staging, and target systems to ensure accuracy and completeness (an illustrative reconciliation check is sketched after this list).
- Design and implement SQL- and Python-based automation frameworks for regression, smoke, and system integration testing.
- Test data quality dimensions such as accuracy, completeness, consistency, timeliness, and validity.
- Perform negative testing, boundary testing, and exception handling to ensure robustness of pipelines.
- Collaborate with developers, data engineers, architects, and business analysts to identify data gaps, defects, and performance issues.
- Conduct performance testing of queries and transformations to identify bottlenecks and recommend optimizations.
- Provide clear and detailed defect reports, test execution results, and testing dashboards to stakeholders.
- Support CI/CD integration of automated test scripts into deployment pipelines.
- Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and contribute to continuous improvement of test processes.
- Mentor junior team members in best practices for data testing and automation.
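The sketch below illustrates the kind of source-to-target reconciliation check described in this list. It is a minimal example only; the table names (source_db.orders, target_db.orders) and the order_id/amount columns are hypothetical placeholders, not details of this engagement.

# Illustrative pytest-style reconciliation checks; table and column names are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-reconciliation-checks").getOrCreate()

def test_row_counts_match():
    # Completeness: every source row should land in the target table.
    src_count = spark.table("source_db.orders").count()
    tgt_count = spark.table("target_db.orders").count()
    assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

def test_amount_totals_match():
    # Accuracy: aggregate measures should agree after transformation.
    src_total = spark.table("source_db.orders").agg(F.sum("amount")).first()[0]
    tgt_total = spark.table("target_db.orders").agg(F.sum("amount")).first()[0]
    assert src_total == tgt_total, f"sum(amount) mismatch: {src_total} vs {tgt_total}"

def test_no_orphan_target_keys():
    # Consistency: no target key should be missing from the source.
    src_keys = spark.table("source_db.orders").select("order_id")
    tgt_keys = spark.table("target_db.orders").select("order_id")
    orphans = tgt_keys.subtract(src_keys).count()
    assert orphans == 0, f"{orphans} target keys have no matching source key"

In practice, checks of this kind would typically run inside a Databricks job or a CI/CD pipeline alongside the wider regression suite.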
Minimum Requirements:
- Bachelor's degree in a related field.
- 5+ years of experience in data engineering or data testing roles.
- Proven experience with ETL testing and data validation in large-scale enterprise environments.
- Strong skills in creating test cases and writing SQL/Python scripts to validate data manipulations, report and file outputs, and data comparisons (a minimal file-comparison sketch follows this list).
- Hands-on experience with Databricks (notebooks, workflow, clusters, pipelines, jobs, reports, delta lake).
- Expertise in workflow orchestration tools such as Airflow, Azure Data Factory, or Control-M.
- Advanced proficiency in SQL (complex joins, CTEs, window functions, query optimization, stored procedures).
- Strong scripting skills in Python (pandas, PySpark, unittest/pytest) and Java/JavaScript for test automation frameworks such as Selenium, TestNG, and SoapUI for backend API testing.
- Ability to interpret complex transformation logic and translate it into test validation rules.
- Strong knowledge of data warehouse concepts, star/snowflake schemas, fact/dimension validation.
- Experience testing and comparing structured and semi-structured data (JSON, XML, Parquet, Avro, CSV).
- Experience with data quality frameworks and metadata-driven testing.
- Experience with defect management tools (JIRA) and test management tools (qTest, Zephyr, TestRail).
- Exposure to version control (Git, Bitbucket), CI/CD pipelines, and DevOps practices.
- Strong problem-solving, analytical, and debugging skills.
- Excellent written and verbal communication skills to interface with development teams, clients, and technical stakeholders.
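As a minimal sketch of the file validation and comparison skills listed above, the example below uses pandas against a pipe-delimited extract; the file paths, delimiter, and four-column schema are invented for illustration.

# Illustrative flat-file validation with pandas; paths, delimiter, and schema are assumptions.
import pandas as pd

EXPECTED_COLUMNS = ["order_id", "customer_id", "amount", "order_date"]  # hypothetical layout

def compare_extract_to_baseline(extract_path: str, baseline_path: str) -> None:
    extract = pd.read_csv(extract_path, sep="|", dtype=str)    # delimiter handling
    baseline = pd.read_csv(baseline_path, sep="|", dtype=str)

    # Header/schema verification.
    assert list(extract.columns) == EXPECTED_COLUMNS, f"Unexpected header: {list(extract.columns)}"

    # Null-value handling on the key column.
    assert extract["order_id"].notna().all(), "Null order_id values found in extract"

    # Row-level comparison against the expected baseline file.
    diffs = extract.merge(baseline, how="outer", indicator=True).query("_merge != 'both'")
    assert diffs.empty, f"{len(diffs)} rows differ between extract and baseline"

if __name__ == "__main__":
    compare_extract_to_baseline("daily_extract.psv", "expected_baseline.psv")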
Preferred/Nice to Have:
- IRS GFE (badge, laptop)
- Experience with cloud data platforms (Azure Data Lake, AWS Redshift, Google Cloud Platform BigQuery, Snowflake).
- Knowledge of Big Data technologies (Spark, Kafka).
- Hands-on experience with test data management and data masking for compliance.
- Familiarity with BI/reporting tools (Tableau) to validate data visualization against backend logic.
- Prior experience in federal or regulated environments with compliance standards (HIPAA, IRS, CMS, SOX).
- Background in performance and scalability testing of data pipelines.
Recently ranked as high as #3 among HUBZone Companies in a GOVWIN survey, DTI offers a dynamic environment for those passionate about impactful projects, community involvement, and contributing to top-ranking Federal and State Commissionaires project support teams.
At DTI, we balance continuous growth and innovation with a strong dedication to corporate social responsibility. Join our talented team and be part of a company that values both professional excellence and community impact. Explore the exciting career opportunities awaiting you at DTI!
DTI is committed to hiring and maintaining a diverse workforce. We are an equal opportunity employer making decisions without regard to race, color, religion, sex, national origin, age, veteran status, disability, or any other protected class.