Overview
- Work arrangement: Hybrid
- Compensation: Depends on Experience
- Employment type: Contract - Independent, Contract - W2
- Skills: Atlassian toolset, Confluence
Job Details
Role - QA Analyst
Location - HYBRID: 2750 Monroe Blvd, Audubon, PA 19403
Roles & Responsibilities -
The Quality Analyst will be responsible for test planning, test monitoring and control, test analysis and design, test implementation and test execution, evaluating exit criteria and reporting, and test closure.
The Quality Analyst will be directly involved in hands-on, technical work including test data provisioning and test data management. Specific duties include the following:
- Test planning: define the overall strategic and tactical objectives for testing software changes at different test levels. Work with developers, project managers and business customers to define the strategy to be used, such as risk-based testing. Test levels include component testing, integration testing, system testing and acceptance testing. Test types include functional testing, non-functional testing, structural ("white-box") testing, confirmation testing and regression testing.
- Test monitoring and control: continuously compare test progress with the plan, adjust the plan and testing activities as necessary, and provide status reports.
- Test analysis and design: transform testing objectives into test conditions and test cases. The test basis includes documented requirements, system architecture, behavior and structure of the software, existing data, and data flows. Using structural ("white-box") test techniques, among others, design tests, or provide input into test design by identifying specific test conditions and high-level test cases.
- Test implementation and test execution: develop and prioritize test procedures, set up the test environment and test data, and execute tests. Test changes to the database components of the system under test. Testable components include views, procedures and functions, data conversion and migration programs. Support business customers and others in acceptance testing. Includes identifying database model changes in higher environments (Production and Stage), and making those changes in the lower environments (Test and Development).
- Evaluating exit criteria and reporting, and test closure: assess test execution against the objectives defined in the test plan. Specific tasks and deliverables are defined at the team or project level.
- Identify the necessary test data to support test conditions and test cases as they are defined. Includes data to force error and boundary conditions. Analyze input data, including electronic files, message traffic and variations of user input.
- Provide expected test results, and/or repeatable methods of generating expected test results, based on currently existing data. Includes preparing database queries and guides for testers to use.
- Provide tools and methods to compare expected and actual test results. Includes bi-directional comparisons of database data with electronic files, message streams, and front-end displays.
- Provide high-quality, realistic, fit-for-purpose and referentially intact test data. Capture end-to-end business processes and the associated data for testing. Subset production data: extract subsets of production data from multiple sources to meet test cases and/or to supply input values for data-driven testing. Create realistic test data sets small enough to support rapid test runs but large enough to accurately reflect the variety of production data.
- Provide test data management. Script the setup of data in the application database to put it in a state that allows a specific set of test cases to be run against it. Script the creation of data files and message streams which require changing variables (usually date or timestamp related) to test the applications which process them. Script database cleanups.
- Ensure compliance with PJM Software Testing standards and policies.
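To illustrate the test data management duties above, the following is a minimal sketch of date-shifting a captured test data set so that a previously recorded scenario can be replayed against a date-sensitive application. The record layout and field names (`trade_date`) are hypothetical, not taken from the posting; real data sets would typically be loaded from files or database extracts.

```python
from datetime import date

def shift_dates(records, date_field, anchor=None):
    """Shift every record's date field so the newest date lands on `anchor`
    (today by default), preserving the relative spacing between records."""
    anchor = anchor or date.today()
    newest = max(r[date_field] for r in records)
    offset = anchor - newest
    # Return new dicts rather than mutating the captured source data.
    return [{**r, date_field: r[date_field] + offset} for r in records]

# Usage: a small captured data set whose dates must be made current.
records = [
    {"id": 1, "trade_date": date(2020, 1, 10)},
    {"id": 2, "trade_date": date(2020, 1, 12)},
]
shifted = shift_dates(records, "trade_date", anchor=date(2024, 6, 12))
# The newest date (2020-01-12) maps to the anchor; the 2-day gap is kept.
```

Preserving relative spacing (rather than setting every date to today) keeps time-dependent logic such as aging, expiry, and sequencing behaving as it did when the data was captured.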
Required Skills
- 7+ years of experience as a quality assurance analyst or tester
- Strong customer and business focus
- Strong communication skills
- Experience with user-centered design and usability testing
- Experience with the Atlassian toolset, including Confluence
- Understanding of Oracle databases and the ability to write SQL queries
- Experience with test-driven or behavior-driven development practices
- Someone who is energetic and passionate about their work, extremely positive and solution driven
- Someone who has worked on large teams, on projects that have different business owners
- Experience using iterative development methodologies, specifically Agile Scrum
- Scripting experience using applicable languages
Preferred Skills
- Experience in a SOA / Web Services environment
- Experience using automated testing tools such as SoapUI and TOSCA, and with scriptless automation
- Hands-on experience testing complex technical applications
- Experience in communicating with Business Leadership and stakeholders
Roopesh Pratap Singh
Phone: +1
Email:
LinkedIn:
103 Morgan Ln, Plainsboro Township, NJ 08536, USA