Job Title: Test Lead – Workday ERP Implementation
Duration: 3+ Months (Potential extension)
Location: Hybrid – Minimum 25% onsite per month in Olympia, WA
Job Description:
The project will implement the Workday solution by remediating systems, assessing future-state business processes, and addressing impacts on enterprise systems and stakeholders. The primary objective of this engagement is to ensure the successful planning, execution, and management of all testing and cutover activities related to the Phase 1A Core Financials Workday implementation. This includes the testing and validation of all configured functionalities, integrations, and business processes to confirm that they meet organizational requirements.
The Test Lead is responsible for developing a comprehensive test strategy and plan, coordinating test activities, managing defects, and ensuring that all identified issues are resolved in a timely manner. The goal is to achieve a high-quality, defect-free deployment of the Workday system while ensuring that enterprise systems, other integrated systems (including HRMS), and business processes work seamlessly together. This will minimize risks and ensure a smooth transition to the new platform.
Substantial foundational testing and business process work has already been completed in support of this project. This includes the development of a comprehensive Testing Strategy and high-level Testing Plan, along with the completion of key deliverables capturing current-state and target-state business processes, gap analysis outputs, and initial test cases.
These deliverables reflect extensive effort to document as-is and to-be processes, identify the impacts and gaps between current and future business processes, and establish a baseline understanding of system functionality and business readiness. While this work provides a strong foundation for project testing, the enterprise testing approach, sequencing, and phases continue to evolve. As a result, the project-level Testing Strategy and Plan will require ongoing review, refinement, and updates to align with the most current program guidance, timelines, environments, and dependencies.
The Test Lead will assess the existing testing artifacts, identify and address any gaps, update the Testing Strategy and Plan as needed, and lead execution of testing activities through the remaining project phases.
The primary focus areas for this work are outlined in the sections below. The Project Manager will meet regularly with the Test Lead to identify and prioritize tasks, activities, deliverables, and associated work packages to be addressed during upcoming project periods.
1.1 Test Strategy & Planning
- Enhance the project testing strategy and plan so that they comprehensively cover all aspects of business process transformation and technology modernization.
- Continuously update the Test Plan to reflect program testing phases, timelines, milestones, and requirements.
- Utilize Azure DevOps (ADO) to develop a method for tracking, monitoring, documenting, and reporting testing status and results, including designing dashboards that clearly communicate testing progress at any given point.
- Collaborate with program leadership, particularly testing directors and leads, to pilot testing concepts on behalf of enterprise stakeholders impacted by the Core Financials implementation.
- Define test metrics and key performance indicators (KPIs) to assess the effectiveness and efficiency of testing activities.
- Establish a risk-based testing approach that considers business criticality, integration complexity, change impact, and frequency of use.
- Evaluate project requirements, program requirements, and user stories to determine appropriate test coverage based on risk assessments.
- Identify and prioritize high-risk use cases with significant impact on critical business processes, financial data integrity, and system integration points.
- Determine the appropriate level of testing for each use case based on risk level, complexity, and business impact.
- Allocate testing resources and efforts to ensure adequate coverage of high-risk and business-critical areas.
- Plan and coordinate all testing activities, including resource allocation, scheduling, dependencies, and risk management.
- Align the project test strategy with the overall program testing framework.
- Sequence and coordinate testing activities across project teams, program teams, and vendors.
- Communicate testing status, risks, and coverage information to support project readiness and decision-making discussions.
- Develop documentation for the testing approach that reflects agreed priorities, assumptions, constraints, and alignment with program-level testing.
- Continuously monitor and adjust test coverage and prioritization based on project progress, changes in requirements, and identified risks.
Expectations:
Risk-Based, Enterprise-Aligned Strategy:
- Implement a risk-based testing approach prioritizing business-critical workflows, high-risk integrations, and mission-critical financial processes.
- Testing scope and prioritization should consider:
  - Business criticality
  - Integration complexity
  - Frequency of use
  - Change impact
  - Historical defect trends (where available)
- The Test Strategy must align with and integrate into the overall program master test plan and schedule.
- The primary objective is to prepare the organization for testing and implementation of the Phase 1A Core Financials deployment.
Stakeholder-Driven Planning:
- Subject matter experts (SMEs), technical leads, and program stakeholders are expected to actively participate in defining scope, validating priorities, and approving testing artifacts.
- The Test Lead is responsible for coordinating and maintaining this engagement throughout the testing phases.
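The risk factors named in the strategy above (business criticality, integration complexity, change impact, frequency of use, historical defect trends) could be combined into a simple weighted scoring sketch for prioritizing test areas. The weights, 1-5 scales, and area names below are illustrative assumptions, not values from this engagement; actual factors and weighting would be agreed with program stakeholders.

```python
from dataclasses import dataclass

# Hypothetical weights for the risk factors listed in the strategy;
# a real model would be calibrated with program leadership.
WEIGHTS = {
    "business_criticality": 0.30,
    "integration_complexity": 0.25,
    "change_impact": 0.20,
    "frequency_of_use": 0.15,
    "historical_defects": 0.10,
}

@dataclass
class TestArea:
    name: str
    scores: dict  # factor name -> 1 (low risk) .. 5 (high risk)

def risk_score(area: TestArea) -> float:
    """Weighted sum of the risk factors; unscored factors default to low."""
    return round(sum(WEIGHTS[f] * area.scores.get(f, 1) for f in WEIGHTS), 2)

def prioritize(areas):
    """Order candidate test areas from highest to lowest risk."""
    return sorted(((a.name, risk_score(a)) for a in areas),
                  key=lambda pair: pair[1], reverse=True)
```

Ranking output like this could feed the coverage and resource-allocation decisions described above, with high-scoring areas receiving the deepest test coverage.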
1.2 Test Case Development
- Define test coverage objectives for end-to-end, integration, and organization-specific scenarios.
- Develop test cases covering Workday functionality, legacy system remediation, new and modified business processes, and integrations with Workday.
- Develop test cases supporting multiple testing types including:
  - Functional testing
  - Integration testing
  - End-to-end testing
  - Performance testing
  - User Acceptance Testing (UAT)
- Develop supplemental test cases to address organization-specific gaps not covered by program-provided scenarios.
- Ensure test cases are comprehensive, accurate, and aligned with approved project requirements and business processes.
- Develop and maintain a centralized test case repository using Azure DevOps (ADO) to support traceability, reuse, and consistency.
- Develop and maintain traceability between test cases, requirements, and identified risks.
Expectations:
All test cases developed under this task must meet the following standards:
- Test cases must provide end-to-end and integration-focused coverage, including legacy system remediation, Workday business processes, system-to-system integrations (such as API- and SFTP-based interfaces), and full business workflows.
- Organization-specific gaps not covered by program-provided scenarios must be identified and addressed through supplemental test cases.
- All test cases must be traceable to approved requirements and identified risks through a Requirements Traceability Matrix (RTM).
- Test cases and the RTM must undergo formal review and approval by subject matter experts (SMEs), the Project Manager, and the Test Lead prior to execution.
- Test case coverage, traceability status, and closure must be documented and communicated throughout the testing lifecycle.
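The traceability expectation above can be sketched as a simple RTM gap check: every approved requirement should map to at least one test case, and every test case should trace back to a requirement. The IDs and dictionary shape below are illustrative; in practice this traceability would live in Azure DevOps work-item relations.

```python
# Minimal RTM coverage check. Requirement IDs and the "covers" field
# are placeholders, not the project's actual schema.
def rtm_gaps(requirements, test_cases):
    """Return (requirement IDs with no test case, test-case IDs with no requirement)."""
    covered = {req for tc in test_cases for req in tc["covers"]}
    untested = sorted(r for r in requirements if r not in covered)
    untraced = sorted(tc["id"] for tc in test_cases if not tc["covers"])
    return untested, untraced
```

Running a check like this before each review cycle gives SMEs and the Project Manager a concrete list of gaps to close before test cases are approved for execution.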
1.3 Test Execution and Defect Management
- Plan and coordinate execution of all applicable project and program test cases across all testing phases.
- Manage the execution of test cases and coordinate with subject matter experts (SMEs) to ensure participation, feedback, and validation of results.
- Develop an approach for collecting, consolidating, and reporting test execution results across all testers.
- Develop testing status reports, defect trends, and risk summaries to support escalation and readiness discussions.
- Define defect severity levels, prioritization, and escalation procedures aligned with project and program standards.
- Manage the defect lifecycle, including defect reporting, assignment, tracking, retesting, and closure.
- Plan and conduct regular defect review and triage meetings with stakeholders to support timely issue resolution and visibility into testing risks.
- Coordinate with development and technical teams to support timely investigation and resolution of defects.
Expectations:
Structured Execution Across All Testing Phases:
The Test Lead is responsible for managing test execution across:
- System Testing
- Integration Testing
- Regression Testing
- Business-led User Acceptance Testing (UAT)
Business users are expected to execute and validate real-world scenarios during UAT, not just technical success paths.
Disciplined Defect Management:
- Azure DevOps (ADO) will be the system of record for defect tracking and test execution.
- Defects must be prioritized based on severity with clearly defined resolution targets.
- Regular defect triage meetings will be conducted, with escalation of unresolved blockers to ensure project timelines and cutover schedules are protected.
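The severity-based resolution targets above could be operationalized as a small escalation check that triage meetings run against open defects. The severity names and target durations below are hypothetical; actual values would follow project and program standards.

```python
from datetime import datetime, timedelta

# Hypothetical severity levels and resolution targets; the real
# definitions come from project and program defect standards.
RESOLUTION_TARGETS = {
    "blocker": timedelta(days=1),
    "critical": timedelta(days=3),
    "major": timedelta(days=7),
    "minor": timedelta(days=14),
}

def needs_escalation(defect, now):
    """True when an open defect has exceeded its severity's resolution target."""
    age = now - defect["opened"]
    return defect["status"] == "open" and age > RESOLUTION_TARGETS[defect["severity"]]
```

A query like this over the ADO defect backlog would give each triage meeting a ready-made list of blockers to escalate before they threaten the cutover schedule.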
1.4 Test Data and Environment Management
- Define test data and environment requirements necessary to support planned testing activities.
- Plan for the availability, stability, and coordination of test environments and required system integrations.
- Develop an approach for preparing, validating, and maintaining test data that supports priority business scenarios and integration workflows.
- Ensure test data is representative of production data and supports agreed priority and high-risk scenarios.
- Manage test environments, where applicable, to ensure they are available, stable, and appropriately configured for testing activities.
- Coordinate with technical teams to set up and maintain test environments, including databases, servers, and network configurations.
- Identify, track, and communicate test readiness dependencies, constraints, and risks that may impact testing activities.
- Define data handling and protection approaches to ensure compliance with organizational security and privacy standards.
Expectations:
Readiness Before Testing:
- Test environments must be stable and properly configured before testing begins.
- Data integrity must be maintained across regression testing iterations.
- System integrations must be active and available for testing.
- Test data must be available, validated, and compliant with privacy standards.
- Environment or data readiness issues should be tracked and escalated as program risks where necessary.
Representative and Secure Data:
- Test data should reflect production-like scenarios, including edge cases and integration data.
- Sensitive data must be masked or obfuscated in accordance with organizational security standards.
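One common way to meet the masking expectation above while keeping test data production-like is deterministic pseudonymization: the same input always masks to the same token, so joins across tables still line up. The field names and salt below are illustrative; the actual masking approach must follow organizational security standards.

```python
import hashlib

def mask_value(value: str, salt: str = "project-salt") -> str:
    """Deterministically pseudonymize a sensitive field so the same
    source value masks to the same token across tables (preserving joins)."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return f"MASKED-{digest[:10]}"

def mask_record(record: dict, sensitive_fields: set) -> dict:
    """Mask only the designated sensitive fields; leave the rest intact."""
    return {k: mask_value(v) if k in sensitive_fields else v
            for k, v in record.items()}
```

Because the masking is deterministic, regression iterations and integration workflows that depend on key matches still behave like production, without exposing real values.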
1.5 Test Automation
- Evaluate and recommend testing automation tools and frameworks appropriate for the project.
- Develop and maintain test automation scripts to support regression testing and continuous integration testing.
- Integrate test automation into the overall testing lifecycle to improve efficiency, coverage, and repeatability.
- Monitor project performance and track testing progress against defined timelines and objectives.
- Plan, monitor, and track key metrics to measure progress and overall testing performance.
- Identify the root causes of project performance shortfalls and recommend opportunities for improvement.
- Develop and deliver project communications and status updates in the form of documentation, presentations, reports, and other communication materials as required.
- Design and facilitate regular and ad-hoc project meetings and participate in other project-related meetings as necessary.
Expectations:
Automation efforts should focus on:
- High-risk scenarios
- Repetitive testing activities
- Regression-heavy processes
Automation is intended to increase test coverage and efficiency while complementing, not replacing, business validation activities.
Governed AI Usage:
- AI-enabled tools (such as automation assistance or synthetic test data generation) may only be used following organizational security and governance approvals.
- If AI-enabled tools are delayed or not approved, manual testing must proceed without reducing quality expectations.
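At its simplest, the regression automation described above is a harness that runs a suite of named checks and collects pass/fail results for status reporting. The sketch below is tool-agnostic and illustrative; in practice the checks would be implemented in whatever automation framework is selected and its results fed into ADO.

```python
# Minimal regression-harness sketch. Check names and the checks
# themselves are placeholders for real Workday/integration validations.
def run_regression(checks):
    """Execute each (name, callable) check; return a name -> result map."""
    results = {}
    for name, check in checks:
        try:
            check()
            results[name] = "pass"
        except AssertionError as exc:
            results[name] = f"fail: {exc}"
    return results
```

Collecting results this way, rather than stopping at the first failure, supports the reporting and defect-trend expectations elsewhere in this engagement.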
1.6 Cutover Planning and Execution
- Define testing-related inputs required to support cutover planning and go-live readiness discussions.
- Plan testing activities and sequencing to align with program cutover timelines, checkpoints, and dependencies.
- Develop testing status reports, defect summaries, and risk information to support cutover coordination and readiness assessments.
- Plan for and participate in cutover planning, cutover plan reviews, mock cutovers, and cutover execution activities from a testing perspective.
- Lead and manage cutover planning and execution activities in coordination with the broader program teams.
- Respond to and coordinate questions or escalations related to organizational cutover activities, working with the cutover team to resolve open items.
- Manage cutover tasks, risks, issues, and dependencies to ensure alignment with the approved cutover plan of record.
- Coordinate with system owners, business subject matter experts, technical teams, and project workstreams to ensure that cutover and system remediation activities, deliverables, and milestones are completed as planned.
- In coordination with the Project Manager, develop and provide cutover and remediation progress reporting to program leadership and stakeholders.
Expectations:
Testing Exit Criteria for Cutover:
Cutover readiness is contingent upon the following testing conditions:
- All defects are resolved or appropriately dispositioned prior to go-live.
- The Requirements Traceability Matrix (RTM) is complete and validated.
- The User Acceptance Testing (UAT) summary has been reviewed and approved.
- Formal test sign-off confirming readiness for deployment is obtained.
Testing outcomes will serve as a critical go/no-go input for deployment decisions.
Integrated Program Cutover:
- The Test Lead is responsible for coordinating organizational cutover activities with overall program timelines, checklists, and mock cutovers.
- Risks, issues, and readiness gaps must be actively tracked, communicated, and escalated to project leadership.
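The exit criteria above amount to a go/no-go gate: readiness holds only when every criterion is met, and any unmet item must be surfaced for escalation. The flag names below are illustrative stand-ins for however the project records each criterion.

```python
# Go/no-go gate sketch for the testing exit criteria; field names
# are illustrative, not the project's actual readiness checklist.
EXIT_CRITERIA = ("defects_dispositioned", "rtm_validated",
                 "uat_summary_approved", "test_signoff_obtained")

def cutover_ready(status: dict):
    """Return (overall readiness, list of unmet exit criteria)."""
    unmet = [c for c in EXIT_CRITERIA if not status.get(c, False)]
    return (not unmet, unmet)
```

Reporting the unmet list, rather than a bare yes/no, gives leadership the specific readiness gaps to track and escalate ahead of the go/no-go decision.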
Preferred Education, Experience, and Competencies:
- Strong commitment to quality assurance and protecting the end-user experience.
- Relevant professional experience in program or portfolio-level coordination, technical leadership, and project management.
- Strong expertise in test case design techniques and testing tools.
- Minimum 10 years of experience as a software testing or development lead supporting large-scale SaaS implementations.
- Proven experience supporting large and complex Enterprise Resource Planning (ERP) implementations.
- Minimum 10 years of experience working with test automation frameworks, scripting languages, and automation tools to improve testing efficiency and coverage.
- Experience creating comprehensive test strategies and test plans, with a strong understanding of software testing methodologies, tools, and frameworks.
- Skilled in defining test scenarios, selecting appropriate test data, and ensuring comprehensive test coverage.
- Strong analytical skills to identify potential vulnerabilities and define appropriate test coverage to support effective defect identification and resolution.
- Proficiency in Azure DevOps (ADO) for test management, tracking, and reporting.
- Experience with cutover planning and execution, including readiness planning and go-live coordination.
- Experience testing SaaS platforms and system integrations, including APIs and data migration scenarios, using automation tools and testing frameworks.
- Strong leadership, communication, and collaboration skills when working with cross-functional teams.
- Ability to work independently while managing multiple priorities and deadlines in complex project environments.