Overview
On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Skills
API
Access Control
Accessibility
Automated Testing
Business Rules
Customer Relationship Management (CRM)
Enterprise Resource Planning
Plant Lifecycle Management
Product Lifecycle Management
Quality Assurance
Merchandising
Middleware
Regression Testing
Integration Testing
Scalability
NoSQL
Process Improvement
Digital Asset Management
Data Validation
Data Security
Dashboard
Software Development Methodology
User Experience
Web Testing
Software Testing
Reporting
Sprint
SQL
Job Details
Position: PLM Quality Assurance (QA)
Location: San Ramon, CA (Onsite)
Job Type: Long-Term Contract
Responsibilities
1. User Interface (UI) Testing
- Design, execute, and refine test cases for PLM system UI components to ensure accuracy, consistency, and superior user experience.
- Conduct usability, accessibility, and cross-browser/device testing.
- Collaborate with UI/UX configurators, product owners, and developers to resolve usability issues and align outcomes with user expectations.
2. Integration Testing (PLM Process Alignment)
- Test and validate integrations between PLM and enterprise platforms (ERP, Design Tools, Merchandising Tools, CRM, DAM, Reporting Tools, etc.).
- Ensure data, workflows, and automation align with the full PLM process, supporting traceability from concept to release.
- Validate integration logic for change control, configuration management, and release workflows.
- Conduct API and middleware validation, regression testing, and troubleshooting.
3. Customization and Configuration Testing
- Test system customizations, business rules, workflows, and lifecycle configurations to confirm expected behavior.
- Verify new configurations integrate smoothly without disrupting baseline functionality.
- Provide feedback on scalability, maintainability, and long-term quality of system enhancements.
4. Reporting and Data Validation
- Validate accuracy and performance of PLM-generated reports and dashboards.
- Cross-check data outputs with backend sources to ensure business logic consistency.
- Test access controls, data security, and performance of reporting features.
5. Test Planning, Execution, and Continuous Improvement
- Participate in sprint planning, backlog grooming, and requirements reviews to ensure complete test coverage.
- Execute manual and automated test cases, refining them iteratively based on test results and findings.
- Identify root causes of defects and recommend both tactical fixes and strategic process improvements.
- Maintain detailed, actionable defect reports in Jira or similar tools.
- Contribute to QA maturity by advancing testing standards, automation coverage, and traceability practices.
Qualifications & Skills
Required
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 3+ years of experience in software testing or quality engineering (enterprise-scale systems preferred).
- Strong understanding of QA methodologies, SDLC, and Agile/Scrum practices.
- Hands-on experience testing web applications, APIs, and integrated platforms.
- Proficiency in test automation frameworks and SQL/NoSQL data validation.
- Strong analytical and problem-solving abilities, with adaptability to improve test design during execution.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.