Select the roles required in a formal review:
In a formal review, the roles involved typically include the author, management, facilitator (also known as moderator), review leader, reviewers, and scribe. Each role has specific responsibilities to ensure the effectiveness and efficiency of the review process:
The author creates and refines the work product being reviewed.
Management allocates resources and supports the review process.
The facilitator manages the review meeting, ensuring it proceeds smoothly.
The review leader plans the review and ensures it meets its objectives.
Reviewers examine the work product to identify defects.
The scribe records issues raised during the review meeting.
Determining the schedule for each testing activity and the test milestones for a test project, using activity estimates, available resources, and other constraints, is a typical task performed during which test activity?
Test planning involves defining the overall approach to testing, including scheduling, resources, and milestones. During this activity, the detailed schedule for each testing activity is determined based on estimates, resource availability, and constraints. The ISTQB CTFL Syllabus v4.0 outlines that test planning encompasses the creation of test plans and schedules to ensure that testing activities are properly managed and controlled.
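As a minimal illustration of how a schedule can be derived from activity estimates, resource availability, and constraints, the following Python sketch sequences some hypothetical test activities; the activity names, effort figures, and dates are assumptions for illustration, not values from the syllabus.

from datetime import date, timedelta

# Hypothetical effort estimates in person-days (illustrative values only).
estimates = {
    "test analysis": 5,
    "test design": 8,
    "test implementation": 6,
    "test execution": 10,
}
testers_available = 2          # resource constraint
start = date(2024, 1, 8)       # scheduling constraint: earliest start date

schedule = []
current = start
for activity, person_days in estimates.items():
    # Calendar duration = effort spread across the available testers (rounded up).
    duration_days = -(-person_days // testers_available)
    end = current + timedelta(days=duration_days)
    schedule.append((activity, current, end))  # the end date serves as the milestone
    current = end

for activity, begins, milestone in schedule:
    print(f"{activity:20s} {begins} -> {milestone}")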
In a two-hour uninterrupted test session, performed as part of an iteration on an Agile project, a heuristic checklist was used to help the tester focus on some specific usability issues of a web application.
The unscripted tests produced by the tester's experience during such session belong to which one of the following testing quadrants?
The unscripted tests produced by the tester's experience during the two-hour session belong to testing quadrant Q3. The testing quadrants classify testing types along two dimensions: the test objective (supporting the team or critiquing the product) and the perspective taken (business-facing or technology-facing). The quadrants are labeled Q1, Q2, Q3, and Q4, and each represents a different testing perspective, such as unit testing, acceptance testing, usability testing, or performance testing. Quadrant Q3 covers testing types that critique the product from the business perspective, such as exploratory testing, usability testing, user acceptance testing, and alpha and beta testing. The unscripted tests in the given scenario are examples of exploratory testing and usability testing: they are based on the tester's experience, intuition, and learning of the web application, and they focus on specific usability issues such as the user interface, user satisfaction, and user feedback. The other options are incorrect because:
The testing quadrant Q1 covers testing types that support the team from the technology perspective, such as unit testing, component testing, and component integration testing. These tests are usually performed by developers or testers who work with the source code, design, architecture, or configuration of the system, and they verify its functionality, quality, and reliability at different levels of integration.
The testing quadrant Q2 covers testing types that support the team from the business perspective, such as functional testing, acceptance testing, story testing, and scenario testing. These tests are usually performed by testers or customers who work from the requirements, specifications, user stories, or business processes, and they validate that the system meets the expectations and needs of its users and stakeholders.
The testing quadrant Q4 covers testing types that critique the product from the technology perspective, such as performance testing, security testing, reliability testing, and compatibility testing. These tests are usually performed by testers or specialists using the appropriate tools, metrics, standards, or benchmarks, and they evaluate non-functional aspects such as efficiency, security, reliability, and compatibility under different conditions and environments (a small code sketch of the quadrant classification follows the references below).
Reference: ISTQB Certified Tester Foundation Level (CTFL) v4.0 sources and documents:
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 1.3.1, Testing in Software Development Lifecycles
ISTQB Glossary of Testing Terms v4.0, Testing Quadrant, Exploratory Testing, Usability Testing, Unit Testing, Component Testing, Integration Testing, System Testing, Functional Testing, Acceptance Testing, Story Testing, Scenario Testing, Performance Testing, Security Testing, Reliability Testing, Compatibility Testing
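As a minimal sketch of the two-dimensional classification described above, the following Python mapping groups example test types by quadrant; the groupings are indicative, not an exhaustive list taken from the syllabus.

# Testing quadrants keyed by label; example types are indicative only.
quadrants = {
    "Q1": {"objective": "support the team", "facing": "technology",
           "examples": ["unit testing", "component testing", "component integration testing"]},
    "Q2": {"objective": "support the team", "facing": "business",
           "examples": ["functional testing", "story testing", "acceptance testing"]},
    "Q3": {"objective": "critique the product", "facing": "business",
           "examples": ["exploratory testing", "usability testing", "alpha/beta testing"]},
    "Q4": {"objective": "critique the product", "facing": "technology",
           "examples": ["performance testing", "security testing", "reliability testing"]},
}

def classify(objective, facing):
    """Return the quadrant label matching the given objective and perspective."""
    for label, quadrant in quadrants.items():
        if quadrant["objective"] == objective and quadrant["facing"] == facing:
            return label

# The session in the question: unscripted, usability-focused tests critique the
# product from the business perspective, hence Q3.
print(classify("critique the product", "business"))  # -> Q3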
Which of the following is a test task that usually occurs during test implementation?
A test task that usually occurs during test implementation is making sure the planned test environment is ready to be delivered. The test environment is the hardware and software configuration on which the tests are executed, and it should be as close as possible to the production environment in which the software system will operate. The test environment should be planned, prepared, and verified before test execution, to ensure that the test conditions, test data, test tools, and test interfaces are available and functional (a minimal readiness-check sketch follows the references below). The other options are not test tasks that usually occur during test implementation; they belong to other test activities:
Find, analyze, and remove the causes of the failures highlighted by the tests: this task usually occurs during test analysis and design, the activity of analyzing the test basis, designing the test cases, and identifying the test data. Here, testers can use techniques such as root cause analysis, defect prevention, and defect analysis to find, analyze, and address the causes of failures highlighted by earlier tests, and to prevent or reduce similar failures in future tests.
Archive the testware for use in future test projects: this task usually occurs during test closure, the activity of finalizing and reporting the test results, evaluating the test process, and identifying improvement actions. Here, testers archive the testware produced during testing, such as the test plan, test cases, test data, test results, and defect reports, for use in future test projects such as regression testing or maintenance testing.
Gather the metrics that are used to guide the test project: this task usually occurs during test monitoring and control, the activity of tracking and reviewing test progress, status, and quality, and taking corrective actions when necessary. Here, testers gather metrics such as test coverage, defect density, test effort, and test duration, which are used to guide the test project in planning, estimating, scheduling, reporting, and improving the test process.
Reference: ISTQB Certified Tester Foundation Level (CTFL) v4.0 sources and documents:
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.1, Test Planning
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.2, Test Monitoring and Control
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.3, Test Analysis and Design
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.4, Test Implementation
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.5, Test Execution
ISTQB Certified Tester Foundation Level Syllabus v4.0, Chapter 2.1.6, Test Closure
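As a minimal sketch of verifying that a planned test environment is ready to be delivered, the following Python script runs a few illustrative readiness checks; the host name, data file path, and tool name are hypothetical assumptions, not items prescribed by the syllabus.

import shutil
import socket
from pathlib import Path

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical readiness criteria for the planned test environment.
CHECKS = {
    "application host reachable": lambda: port_open("test-env.example.com", 443),
    "test data set present":      lambda: Path("testdata/accounts.csv").exists(),
    "browser driver installed":   lambda: shutil.which("chromedriver") is not None,
}

def environment_ready():
    results = {name: check() for name, check in CHECKS.items()}
    for name, ok in results.items():
        print(f"[{'OK' if ok else 'MISSING'}] {name}")
    return all(results.values())

if __name__ == "__main__":
    raise SystemExit(0 if environment_ready() else 1)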
A calculator application is used to calculate the result of 5+6.
The user noticed that the result given is 6.
This is an example of:
According to the ISTQB Glossary of Testing Terms, Version 4.0, 2018, page 18, a failure is "an event in which a component or system does not perform a required function within specified limits". In this case, the calculator software does not perform the required function of calculating the correct result for 5+6 within the specified limits of accuracy and precision. Therefore, this is an example of a failure.
The other options are incorrect because:
A mistake is "a human action that produces an incorrect result" (page 25). A mistake is not an event but an action, and it may or may not lead to a failure. For example, a mistake could be a typo in the code, a wrong assumption in the design, or a misunderstanding of a requirement.
A fault is "a defect in a component or system that can cause the component or system to fail to perform its required function" (page 16). A fault is not an event but a defect, and it may or may not cause a failure. For example, a fault could be a logical error in the code, a missing specification in the design, or a contradiction in a requirement (see the code sketch after the reference below).
An error is "the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition" (page 15). An error is not an event but a difference, and it may or may not result in a failure. For example, an error could be a rounding error in a calculation, a measurement error in an observation, or a deviation from a specified condition.
Reference: ISTQB Glossary of Testing Terms, Version 4.0, 2018, pages 15-18, 25; ISTQB CTFL 4.0 - Sample Exam - Answers, Version 1.1, 2023, Question 96, page 34.
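As a minimal sketch of the distinction, assuming a hypothetical add function in which the programmer's mistake introduced a fault, the following Python snippet shows how executing the defective code produces the failure observed in the question (5+6 reported as 6).

def add(a, b):
    # Fault (defect): the programmer's mistake left out the first operand;
    # the intended statement was "return a + b".
    return b

expected = 11
observed = add(5, 6)
# Failure: the observed result (6) deviates from the expected result (11).
print(f"expected {expected}, observed {observed}")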