Study Notes - ISTQB Foundation Level - Feb-2023
To do:
• Read the glossary
Learning objectives:
K1: Remember - recognize and recall something
• Identify typical objectives of testing
• Be able to recognize the definition of failure within the ISTQB Glossary
• Define risk levels by using likelihood and impact
K2: Understand - understand the reasons behind something
• Distinguish between the root cause of a defect and its effects
• Explain the impact of context on the test process
• Explain the differences and similarities between integration and system testing
K3: Apply - apply a concept to something
• Identify boundary values for valid and invalid partitions (see the sketch after this list)
• Write a defect report, covering a defect found during testing
• Apply a review technique to a work product to find defects
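A minimal Python sketch of the boundary value objective above, assuming a hypothetical age field that accepts values from 18 to 65 (the field and its limits are illustration only, not from the syllabus):
    # Equivalence partitions: invalid (<18), valid (18-65), invalid (>65).
    # Two-value boundary analysis takes each boundary plus its nearest neighbour in the adjacent partition.
    VALID_MIN, VALID_MAX = 18, 65
    def is_valid_age(age: int) -> bool:
        # Hypothetical component under test: accepts only the valid partition.
        return VALID_MIN <= age <= VALID_MAX
    boundary_values = [VALID_MIN - 1, VALID_MIN, VALID_MAX, VALID_MAX + 1]  # 17, 18, 65, 66
    for value in boundary_values:
        print(value, "->", "valid" if is_valid_age(value) else "invalid")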
Quality Management
Testing ≠ Quality Assurance
Testing Process:
• Test planning
• Test execution
• Analysing results
• Designing and implementing tests
• Reporting testing progress
• Evaluating the system that is being tested
Testing activities:
• Test planning
Activities that define the objectives of testing and the approach for meeting test
objectives
• Test monitoring and control
○ Monitoring: comparison of actual progress against planned progress
○ Control: taking actions necessary to meet objectives in test plan
○ Both supported by evaluation of exit criteria (definition of done)
• Test analysis
System under test is analyzed to identify testable features
○ Determines WHAT to test
• Test design
Test conditions are elaborated into high-level test cases and sets of test cases
○ Defines HOW to test
○ Identifying necessary test data to support test conditions
○ Designing the test environment and identifying any required tools
• Test implementation
Testware necessary for test execution is created
○ Confirms whether everything is in place to run the tests
○ Developing and prioritizing test procedures and potentially creating automated test
scripts.
○ Creating test suites from the test procedures
○ Preparing test data
• Test execution (see the sketch after this list)
○ Recording the IDs and versions of the test item
○ Executing tests either manually or through tools
○ Comparing actual results with expected results
○ Logging the outcome of test execution
• Test completion
Collect data from completed test activities to consolidate any relevant information
○ Checking whether all defect reports are closed
○ Creating a test summary report
○ Finalizing and archiving the testware
○ Handing over the testware to the maintenance teams
○ Using the information gathered to improve test process maturity
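A minimal Python sketch of the implementation and execution activities above, using a hypothetical discount function as the test item (all names and values are assumptions for illustration):
    def discount(total: float) -> float:
        # Hypothetical test item (assumption): 10% off orders of 100 or more.
        return round(total * 0.9, 2) if total >= 100 else total
    # Test implementation: test procedures with prepared test data and expected results.
    test_cases = [
        ("TC-01", 99.0, 99.0),    # below the discount threshold
        ("TC-02", 100.0, 90.0),   # at the threshold
        ("TC-03", 200.0, 180.0),  # above the threshold
    ]
    # Test execution: run each case, compare actual with expected results, log the outcome.
    for case_id, input_total, expected in test_cases:
        actual = discount(input_total)
        outcome = "PASS" if actual == expected else "FAIL"
        print(f"{case_id}: input={input_total} expected={expected} actual={actual} -> {outcome}")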
SDLC models:
• Sequential
Linear sequential flow of activities
○ Waterfall development model
○ V-model
• Iterative and incremental
○ Agile development model
○ Rational Unified Process (RUP)
○ Scrum
○ Kanban
○ Spiral
○ DevOps
Test Levels:
• Component testing (see the sketch after this list)
Focuses on components that are separately testable
○ Reducing risk
○ Verifying component behavior
○ Building confidence in the component's quality
○ Finding defects
○ Preventing defect escapes to higher test levels
• Integration testing
Focuses on interactions between components or systems
○ Reducing risk
○ Verifying interface behavior
○ Building confidence in the quality of the interfaces
○ Finding defects
○ Preventing defect escapes to higher test levels
Component Integration Testing
Interactions and interfaces between integrated components
System Integration Testing
Interactions and interfaces between systems, packages and microservices
• System testing
Focuses on the behavior and capabilities of a whole system or product
Often considering the end-to-end tasks the system can perform.
○ Reducing risk
○ Verifying total system behavior
○ Verifying the system is complete
○ Building confidence in the quality of the system as a whole
○ Finding defects
○ Preventing defect escapes to higher test levels or to production
○ Satisfying legal or regulatory requirements
• Acceptance testing
Focuses on the behavior and capabilities of a whole system or product
○ Establish confidence in the quality of the system as a whole
○ Validating system is complete
○ Verifying system behaves as specified
○ User acceptance testing
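A minimal Python sketch of the difference between component testing and component integration testing, using two hypothetical units (the functions and values are assumptions, not part of the syllabus):
    def parse_amount(text: str) -> float:
        # Hypothetical component A: converts user input to a number.
        return float(text.strip())
    def add_vat(amount: float, rate: float = 0.21) -> float:
        # Hypothetical component B: adds VAT at a fixed rate.
        return round(amount * (1 + rate), 2)
    # Component testing: each unit is tested separately, in isolation.
    assert parse_amount(" 100 ") == 100.0
    assert add_vat(100.0) == 121.0
    # Component integration testing: the interface between the two units is tested.
    assert add_vat(parse_amount("100")) == 121.0
    print("component and component integration checks passed")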
Test Types:
• Functional Testing
Involves tests that evaluate the functions that the system should perform (see the sketch after this list)
Black box techniques may be used to derive test conditions and test cases for the
functionality of the component or system
○ The functions are WHAT the system should do.
○ Considers the behavior of the software
○ Test the functionality of the component or system
○ May involve special skills (a particular business problem that the software solves)
• Nonfunctional Testing
○ Evaluates characteristics of systems and software, such as
Usability
Performance efficiency
Security
○ Tests HOW well the system behaves
○ Should be performed at all levels and early
○ Late discovery of nonfunctional issues can be dangerous
• White Box Testing
○ Derives tests based on the system's internal structure
Internal structure may include:
Code
Architecture
Workflows
Data flows
○ May require knowledge of the way the code is built, how data is stored
• Change Related Testing
Testing following a change to the system from a fix, upgrade or new feature; it should be
done to confirm that the changes have corrected the defect or implemented the
function correctly
○ Confirmation testing, after a defect is fixed - re-running the test cases that failed due to the defect
○ Regression testing - Testing other areas of the system to ensure stability
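A minimal Python sketch contrasting the functional (black-box) and white-box views above on the same hypothetical function; the spec, names and values are assumptions for illustration:
    def shipping_cost(order_total: float) -> float:
        # Hypothetical component (assumed spec): orders of 50 or more ship free, otherwise 4.99.
        if order_total >= 50:
            return 0.0
        return 4.99
    # Black-box (functional) test: derived from the specification alone - WHAT the system should do.
    assert shipping_cost(75.0) == 0.0
    # White-box tests: derived from the code structure, chosen to exercise both branches of the decision.
    assert shipping_cost(50.0) == 0.0    # true branch
    assert shipping_cost(49.99) == 4.99  # false branch
    print("functional and structural checks passed")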
Maintenance triggers:
• Modification
• Migration
Operational test of the new environment
Operational test of the changed software
• Retirement
Impact Analysis:
Evaluate the changes that were made for the maintenance release to identify the intended
consequences as well as expected and possible side effects of the change.
Can also help identify the impact of a change on existing tests (see the sketch after this list).
• Evaluate the changes
• Identify the consequences
• Identify the impact of the change
• Can be done before a change is made
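A minimal Python sketch of impact analysis using a hypothetical traceability map (module and test names are assumptions): given the items changed in a maintenance release, it selects the existing tests that are likely affected and should be re-run:
    # Hypothetical traceability between changed items and existing tests (assumption).
    traceability = {
        "login": ["TC-LOGIN-01", "TC-LOGIN-02"],
        "checkout": ["TC-CHECKOUT-01", "TC-REPORTS-03"],
        "reports": ["TC-REPORTS-01", "TC-REPORTS-02", "TC-REPORTS-03"],
    }
    changed_modules = ["checkout"]  # modified in the maintenance release
    # Impact analysis: identify the tests affected by the change.
    affected = sorted({test for module in changed_modules for test in traceability[module]})
    print("Tests to re-run:", affected)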
Static Testing:
• Static analysis is important for safety-critical computer systems
Aviation, medical or nuclear software
• Is often incorporated into automated software build and distribution tools
In agile development, continuous delivery and deployment
• Almost any work product can be examined using static testing
Specifications and requirements (business, functional, security requirements)
Epics, user stories, acceptance criteria
Architecture and design specifications
Code
User guides
Contracts
• Finds defects in work products directly rather than identifying failures caused by defects when
the software is run
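A minimal Python sketch of the kind of defect static testing can find without running the code (hypothetical snippet; whether a given tool reports these exact findings is an assumption): a typical static analyzer or linter examines the source and could flag both issues below.
    def average(values):
        unused_total = 0  # defect a static analyzer can flag: local variable assigned but never used
        return sum(values) / len(values)  # defect: possible division by zero when the list is empty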
Test Techniques
Test Management
*Developers should participate in testing, especially at lower levels, so as to exercise control over the
quality of their own work.
*The way in which independence of testing is implemented varies depending on the software
development lifecycle model.
Test Manager:
The test manager is tasked with the overall responsibility for the test process and the successful
leadership of the test activities
• Develop or review a test policy or strategy for the organization
• Plan the test activities by considering the context and understanding the test objectives and
risks
• Write and update the test plans
• Coordinate the test plan with project managers and product owners
• Monitor test progress and results and check the status of exit criteria to facilitate test
completion activities
• Prepare and deliver test progress reports and test summary reports based on the information
gathered
• Adapt planning based on the test results and progress and take any actions necessary for test
control
• Introduce suitable metrics for measuring test progress and evaluate the quality of the testing
and the product
• Support the selection and implementation of tools
• Promote and advocate for the testers
• Develop the skills and careers of testers
Tester:
• Review and contribute to test plans
• Analyze, review and assess requirements, user stories and acceptance criteria
• Identify and document test conditions and capture traceability between test cases, test
conditions and the test basis
• Design, set up and verify test environments, often coordinating with system administration and
network management
• Design and implement test cases and test procedures
• Prepare and acquire test data
• Create the test execution schedule
• Execute tests, evaluate the results and document any deviations from expected results
• Automate tests as needed
• Evaluate nonfunctional characteristics such as performance, efficiency, reliability and security
• Review tests developed by others
Test Strategies:
• Analytical --> (risk-based testing technique)
• Model-based
• Methodical
• Process-compliant
• Directed or consultative
• Regression-averse
• Reactive --> (exploratory testing technique)