Job Details
Job Title- QA Lead
Project Location- Adelphi, MD (3 days a week onsite)
Project Duration- 6-12+ month contract with possibility of extension
With submittals, we'll need two managerial references.
The client confirmed the role supports complex data products including AI/ML applications, classic BI/reporting, and data exchanges with internal and external systems. This person must be strong with data quality tools and have experience with data governance tools such as Databricks, Azure Purview, and Profisee.
Job Description-
I have a new role for a QA Lead who will take hands-on ownership of QA and UAT coordination for an Enterprise Data & AI Platform that delivers AI/BI solutions built on Databricks.
The platform supports complex data products including AI/ML applications, classic BI/reporting, and data exchanges with internal and external systems.
This person will need to be strong with data quality tools, as well as have experience using data governance tools such as Azure Purview and Profisee MDM.
QA Leadership & Team Building
- Build and lead a dedicated QA function to support the Enterprise Data & AI Platform, defining clear roles across data, BI, and automation testing.
- Establish standardized QA processes, templates, and best practices for testing data pipelines, ML workflows, APIs, and BI dashboards.
- Mentor and upskill team members in data quality validation, SQL-based testing, and the use of tools such as Databricks, Purview, and Profisee.
- Establish efficient collaboration practices to align QA activities with Agile sprints, CI/CD pipelines, and UAT readiness.
 
Test Execution & QA Delivery
- Develop, manage, and execute detailed test plans and test cases for:
  - Data pipelines (ETL/ELT) on Databricks and other platforms
  - ML model workflows (training, scoring, deployment) within AI/BI use cases
  - BI dashboards, self-service analytics, and static reports
  - APIs and real-time data integrations, including lakehouse federation
  - Data exchanges with internal systems and external partners
 
- Perform data validation and quality checks using SQL, Python, and tools such as Monte Carlo and Great Expectations (a minimal sketch follows this list).
- Lead functional, integration, regression, and smoke testing activities.
- Track, report, and manage the defect lifecycle in close collaboration with development teams.
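For context on the kind of SQL/Python validation described above, here is a minimal sketch of a data quality check as it might run on Databricks via PySpark; the table names (raw.orders, curated.orders) and key column (order_id) are hypothetical placeholders, not details from the client.

```python
# Minimal sketch of a SQL/Python data quality check on Databricks (PySpark).
# Assumes an active SparkSession named `spark`, as provided in Databricks
# notebooks; all table and column names are hypothetical placeholders.

# 1. Row-count reconciliation between a raw source table and its curated target.
raw_count = spark.sql("SELECT COUNT(*) AS c FROM raw.orders").collect()[0]["c"]
curated_count = spark.sql("SELECT COUNT(*) AS c FROM curated.orders").collect()[0]["c"]
assert raw_count == curated_count, (
    f"Row-count mismatch: raw={raw_count}, curated={curated_count}"
)

# 2. Null check on a key column that must always be populated.
null_keys = spark.sql(
    "SELECT COUNT(*) AS c FROM curated.orders WHERE order_id IS NULL"
).collect()[0]["c"]
assert null_keys == 0, f"{null_keys} rows have a NULL order_id"
```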
 
UAT Management
- Plan, facilitate, and manage User Acceptance Testing (UAT) involving business users for:
  - AI/BI applications running on Databricks
  - Traditional BI reporting and dashboards
  - Data feeds and integration points supporting internal and external consumption
 
 - Prepare UAT test scenarios aligned with business use cases, guide users through testing, and gather actionable feedback.
 - Drive defect triage, resolution, and retesting during UAT cycles, ensuring readiness for production release.
 
Data Governance & Quality Tools
- Utilize and support the adoption of Azure Purview for data cataloging, lineage, and compliance tracking.
 - Work with Profisee for Master Data Management (MDM) practices that impact data quality and consistency.
 - Leverage data observability tools like Monte Carlo for proactive anomaly detection and data reliability monitoring.
 
Agile Collaboration & Execution
- Work within a SAFe Agile framework, participating in PI planning, sprint ceremonies, and cross-team coordination.
 - Collaborate with DevOps, Data Engineers, Data Scientists, and Product Owners to integrate QA into CI/CD pipelines.
 - Manage environment readiness, test data provisioning, and release activities in line with Agile delivery cadence.
 
Tools & Automation
- Maintain and extend automated test suites for APIs, data pipelines, and data quality validations (a sample API test sketch follows this list).
 - Use test management and defect tracking tools such as Azure DevOps, JIRA, and TestRail.
 - Drive continuous improvement of QA automation, tools, and processes.
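As a rough illustration of the API test automation mentioned above, here is a minimal pytest-style check using the requests library; the endpoint URL and response fields are hypothetical placeholders, not details from the client.

```python
# Minimal sketch of an automated API contract check (pytest + requests).
# The base URL and expected fields are hypothetical placeholders.
import requests

BASE_URL = "https://data-platform.example.com/api/v1"  # placeholder endpoint


def test_customer_feed_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/customers/12345", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    # Contract check: required fields are present and sensibly typed.
    assert {"customer_id", "name", "updated_at"} <= set(body)
    assert isinstance(body["customer_id"], int)
```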
 
Required Skills & Experience
- 5+ years in QA roles with direct experience in data platform and analytics projects, including building and leading a dedicated QA team with clear roles across data, BI, and automation testing
 - Proven expertise testing data pipelines, BI tools, and AI/ML workflows on Databricks
 - Hands-on experience with data quality tools and processes.
 - Strong SQL and Python skills for test automation and data validation
 - Experience managing User Acceptance Testing (UAT) with business stakeholders
 - Familiarity with API testing and data integration validation (Postman, REST Assured)
 - Experience working in a SAFe Agile environment
 - Strong communication skills and ability to work with both technical and non-technical stakeholders
 
Preferred Qualifications
- Experience with cloud-native data platforms (Databricks strongly preferred; also Azure Synapse, Snowflake, Google Cloud BigQuery)
 - Knowledge of MLOps and AI/ML model testing techniques
 - Familiarity with data governance frameworks and regulatory compliance (GDPR, HIPAA)
 - ISTQB or other QA certification is a plus