Chapter 9 – Software Verification and Validation
Topics covered
Development testing
Release testing
User testing
Program Testing
Testing is intended to show that a program does what it is intended
to do and to discover program defects before it is put into use.
When you test software, you execute a program using artificial
data.
You check the results of the test run for errors, anomalies or
information about the program’s non-functional attributes.
Testing can reveal the presence of errors but NOT their absence.
Testing is part of a more general verification and validation
process, which also includes static validation techniques.
Program Testing Goals
To demonstrate to the developer and the customer that the software
meets its requirements.
For custom software, this means that there should be at least one test for every
requirement in the requirements document. For generic software products, it
means that there should be tests for all of the system features, plus combinations
of these features, that will be incorporated in the product release.
To discover situations in which the behavior of the software is incorrect,
undesirable or does not conform to its specification.
Defect testing is concerned with rooting out undesirable system behavior such
as system crashes, unwanted interactions with other systems, incorrect
computations and data corruption.
Validation And Defect Testing
The first goal leads to validation testing
You expect the system to perform correctly using a given set of test cases that reflect the
system’s expected use.
The second goal leads to defect testing
The test cases are designed to expose defects. The test cases in defect testing can be
deliberately obscure and need not reflect how the system is normally used.
Testing Process Goals
Validation testing
To demonstrate to the developer and the system customer that the software
meets its requirements
A successful test shows that the system operates as intended.
Defect testing
To discover faults or defects in the software where its behavior is incorrect or
not in conformance with its specification
A successful test is a test that makes the system perform incorrectly and so
exposes a defect in the system.
An Input-output Model of Program Testing
Verification vs Validation
Verification:
"Are we building the product right”.
The software should conform to its specification.
Validation:
"Are we building the right product”.
The software should do what the user really requires.
V & V confidence
Aim of V & V is to establish confidence that the system is ‘fit for purpose’.
This depends on the system’s purpose, user expectations and marketing environment:
Software purpose
The level of confidence depends on how critical the software is to an organisation.
User expectations
Users may have low expectations of certain kinds of software.
Marketing environment
Getting a product to market early may be more important than finding defects in the
program.
Inspections and Testing
Software inspections Concerned with analysis of
the static system representation to discover problems (static verification)
May be supplemented by tool-based document and code analysis.
Software testing Concerned with exercising and observing product
behaviour (dynamic verification)
The system is executed with test data and its operational behaviour is observed.
Inspections and Testing
Inspections, Reviews, Audits
Software Inspections
Involve people examining the source representation with the aim of discovering anomalies and
defects
Do not require execution of a system
► May be used before implementation.
May be applied to any representation of the system
► Requirements, design, test data, etc.
Very effective technique for discovering errors
Many different defects may be discovered in a single inspection
► In testing, one defect may mask another so several executions are required.
Reuse of domain and programming knowledge
► Reviewers are likely to have seen the types of error that commonly arise.
Program Inspections
Formalised approach to document reviews
Intended explicitly for defect DETECTION (not correction)
Defects may be
► Logical errors.
► Anomalies in the code that might indicate an erroneous condition (e.g. an uninitialized variable).
► Non-compliance with standards.
Inspection Procedure
System overview presented to inspection team.
Code and associated documents are distributed to inspection
team in advance.
Inspection takes place and discovered errors are noted.
Modifications are made to repair discovered errors.
Re-inspection may or may not be required.
Inspection Teams
Made up of at least 4 members.
Author of the code being inspected.
Inspector who finds errors, omissions and inconsistencies.
Reader who reads the code to the team.
Moderator who chairs the meeting and notes discovered
errors.
Other roles are Scribe and Chief moderator.
Inspection Checklists
Checklist of common errors should be used to drive the inspection.
Error checklist is programming language dependent.
The 'weaker' the type checking, the larger the checklist.
Examples
► Initialisation
► Constant naming
► Loop termination
► Array bounds
Inspection Rate
500 statements/hour during overview.
125 source statements/hour during individual preparation.
90-125 statements/hour can be inspected.
Inspection is therefore an expensive process.
Inspecting 500 lines costs about 40 person-hours.
Advantages of Inspections
During testing, errors can mask (hide) other errors. Because inspection is
a static process, you don’t have to be concerned with interactions
between errors.
Incomplete versions of a system can be inspected without additional
costs. If a program is incomplete, then you need to develop specialized
test harnesses to test the parts that are available.
As well as searching for program defects, an inspection can also
consider broader quality attributes of a program, such as compliance
with standards, portability and maintainability.
Review
During a review, a group of people examine the software and its
associated documentation, looking for potential problems and non-conformance with standards.
The review team makes informed judgments about the level of quality
of a system or project deliverable.
Project managers may then use these assessments to make planning
decisions and allocate resources to the development process.
The purpose of reviews is to improve software quality, not to assess the
performance of people in the development team.
Review Process
1. Pre-review activities
o Typically, pre-review activities are concerned with review planning and
review preparation.
o Review planning involves setting up a review team, arranging a time
and place for the review, and distributing the documents to be
reviewed.
o During review preparation, the team may meet to get an overview of
the software to be reviewed.
Review Process
2. The review meeting
o During the review meeting, an author of the document or program being
reviewed should ‘walk through’ the document with the review team.
o One team member should chair the review and another should formally record all
review decisions and actions to be taken.
o During the review, the chair is responsible for ensuring that all written comments
are considered.
o The review chair should sign a record of comments and actions agreed during the
review.
Review Process
3. Post-review activities
o After a review meeting has finished, the issues and problems raised during
the review must be addressed.
o This may involve fixing software bugs, refactoring software so that it
conforms to quality standards, or rewriting documents.
o After changes have been made, the review chair may check that the
review comments have all been taken into account.
Audit
A systematic and independent examination to determine
whether quality activities comply with specifications and whether
these specifications are implemented effectively
General audit process:
o Prepare and Plan
o Characterize the Development Process
o Evaluate and Report
o Follow-up
Inspections and Testing
Inspections and testing are complementary and not opposing
verification techniques.
Both should be used during the V & V process.
Inspections can check conformance with a specification but
not conformance with the customer’s real requirements.
Inspections cannot check non-functional characteristics such
as performance, usability, etc.
A Model of the Software Testing Process
Stages of Testing
Development testing, where the system is tested during development to
discover bugs and defects.
Release testing, where a separate testing team test a complete version of
the system before it is released to users.
User testing, where users or potential users of a system test the system in
their own environment.
Development Testing
Development testing includes all testing activities that are carried out by the team
developing the system.
Unit testing, where individual program units or object classes are tested. Unit testing should
focus on testing the functionality of objects or methods.
Component testing, where several individual units are integrated to create composite
components. Component testing should focus on testing component interfaces.
System testing, where some or all of the components in a system are integrated and the
system is tested as a whole. System testing should focus on testing component
interactions.
Unit Testing
Unit testing is the process of testing individual components in isolation.
It is a defect testing process.
Units may be:
Individual functions or methods within an object
Object classes with several attributes and methods
Composite components with defined interfaces used to access their
functionality.
Object Class Testing
Complete test coverage of a class involves
Testing all operations associated with an object
Setting and interrogating all object attributes
Exercising the object in all possible states.
Inheritance makes it more difficult to design object class tests as the information to
be tested is not localised.
The Weather Station Object Interface
Weather Station Testing
Need to define test cases for reportWeather, calibrate, test, startup and
shutdown.
Using a state model, identify sequences of state transitions to be tested
and the event sequences to cause these transitions
For example:
Shutdown -> Running -> Shutdown
Configuring -> Running -> Testing -> Transmitting -> Running
Running -> Collecting -> Running -> Summarizing -> Transmitting -> Running
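As a sketch of how one of these transition sequences might be exercised, the JUnit test below drives a station through Shutdown -> Running -> Shutdown and through a testing/transmitting sequence, checking the state after each event. The WeatherStation class, its State enumeration and the transmitTestResults/transmissionComplete events are hypothetical names chosen for illustration; the slide only specifies the operations reportWeather, calibrate, test, startup and shutdown.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// WeatherStation, State and the transmit* events are hypothetical.
public class WeatherStationStateTest {

    @Test
    public void shutdownRunningShutdownSequence() {
        WeatherStation ws = new WeatherStation();            // assumed to start in SHUTDOWN
        assertEquals(WeatherStation.State.SHUTDOWN, ws.getState());

        ws.startup();                                        // Shutdown -> Running
        assertEquals(WeatherStation.State.RUNNING, ws.getState());

        ws.shutdown();                                       // Running -> Shutdown
        assertEquals(WeatherStation.State.SHUTDOWN, ws.getState());
    }

    @Test
    public void runningTestingTransmittingRunningSequence() {
        WeatherStation ws = new WeatherStation();
        ws.startup();

        ws.test();                                           // Running -> Testing
        assertEquals(WeatherStation.State.TESTING, ws.getState());

        ws.transmitTestResults();                            // Testing -> Transmitting (hypothetical event)
        assertEquals(WeatherStation.State.TRANSMITTING, ws.getState());

        ws.transmissionComplete();                           // Transmitting -> Running (hypothetical event)
        assertEquals(WeatherStation.State.RUNNING, ws.getState());
    }
}
```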
Automated Testing
Whenever possible, unit testing should be automated so that tests are run and
checked without manual intervention.
In automated unit testing, you make use of a test automation framework (such as
JUnit) to write and run your program tests.
Unit testing frameworks provide generic test classes that you extend to create
specific test cases. They can then run all of the tests that you have implemented and
report, often through some GUI, on the success or otherwise of the tests.
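A minimal sketch of what such an automated test class looks like, using JUnit 4's annotation style; MathUtils and its max method are hypothetical and stand in for any unit under test.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// MathUtils.max is a hypothetical unit under test.
public class MathUtilsTest {

    @Test
    public void maxReturnsLargerOfTwoValues() {
        assertEquals(7, MathUtils.max(3, 7));
    }

    @Test
    public void maxHandlesEqualValues() {
        assertEquals(5, MathUtils.max(5, 5));
    }
}
// A test runner (IDE plug-in, Maven/Gradle build or JUnit's console runner)
// discovers the @Test methods, runs them all and reports which passed or failed.
```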
Automated Test Components
A setup part, where you initialize the system with the test case, namely the inputs
and expected outputs.
A call part, where you call the object or method to be tested.
An assertion part where you compare the result of the call with the expected
result. If the assertion evaluates to true, the test has been successful; if false, then it has failed.
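The three parts can be seen directly in a single test method. TemperatureConverter and its toFahrenheit method are hypothetical, used only to make the setup/call/assertion structure concrete.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class TemperatureConverterTest {

    @Test
    public void convertsFreezingPointToFahrenheit() {
        // Setup: define the test case -- the input and the expected output.
        double celsiusInput = 0.0;
        double expectedFahrenheit = 32.0;

        // Call: invoke the method under test (TemperatureConverter is hypothetical).
        double actual = TemperatureConverter.toFahrenheit(celsiusInput);

        // Assertion: compare the actual result with the expected result.
        // If they match (within the tolerance), the test passes; otherwise it fails.
        assertEquals(expectedFahrenheit, actual, 0.001);
    }
}
```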
Unit Test Effectiveness
The test cases should show that, when used as expected, the component
that you are testing does what it is supposed to do.
If there are defects in the component, these should be revealed by test
cases.
This leads to 2 types of unit test case:
The first of these should reflect normal operation of a program and should show
that the component works as expected.
The other kind of test case should be based on testing experience of where
common problems arise. It should use abnormal inputs to check that these are
properly processed and do not crash the component.
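A sketch of the two kinds of unit test case for a hypothetical DateParser component: one test reflects normal operation, the other feeds in abnormal input and checks that it is rejected cleanly rather than crashing the component.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class DateParserTest {

    // Type 1: normal operation -- the component does what it is supposed to do.
    @Test
    public void parsesWellFormedDate() {
        // DateParser is hypothetical; assume it parses "yyyy-mm-dd" strings.
        assertEquals(2024, DateParser.parse("2024-03-15").getYear());
    }

    // Type 2: abnormal input -- the component must reject it cleanly, not crash.
    @Test(expected = IllegalArgumentException.class)
    public void rejectsMalformedDateWithoutCrashing() {
        DateParser.parse("not-a-date");
    }
}
```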
Testing Strategies
Partition testing, where you identify groups of inputs that have common characteristics
and should be processed in the same way.
You should choose tests from within each of these groups.
Guideline-based testing, where you use testing guidelines to choose test cases.
These guidelines reflect previous experience of the kinds of errors that programmers often make
when developing components.
Partition Testing
Input data and output results often fall into different classes where all members of a
class are related.
Each of these classes is an equivalence partition or domain where the program
behaves in an equivalent way for each class member.
Test cases should be chosen from each partition.
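A small worked example: assume a hypothetical InputValidator that accepts integers in the range 10000 to 99999 (a five-digit input, in the spirit of the textbook's partitioning example). The partitions are "below the range", "within the range" and "above the range", and the tests below pick a mid-partition value plus values at and just beyond the partition boundaries.

```java
import org.junit.Test;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.assertFalse;

// InputValidator and the 10000..99999 range are assumptions for illustration.
public class InputValidatorPartitionTest {

    @Test
    public void valueFromTheValidPartitionIsAccepted() {
        assertTrue(InputValidator.isValid(50000));   // mid-partition value
    }

    @Test
    public void boundariesOfTheValidPartitionAreAccepted() {
        assertTrue(InputValidator.isValid(10000));   // lower boundary
        assertTrue(InputValidator.isValid(99999));   // upper boundary
    }

    @Test
    public void valuesFromTheInvalidPartitionsAreRejected() {
        assertFalse(InputValidator.isValid(9999));   // just below the valid range
        assertFalse(InputValidator.isValid(100000)); // just above the valid range
    }
}
```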
Equivalence Partitioning
Equivalence Partitions
Testing Guidelines (Sequences)
Test software with sequences which have only a single value.
Use sequences of different sizes in different tests.
Derive tests so that the first, middle and last elements of the sequence are
accessed.
Test with sequences of zero length.
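These four guidelines can be applied directly to a hypothetical SequenceUtils.largest method that returns the largest element of a list; the sketch below assumes it throws IllegalArgumentException for an empty list.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// SequenceUtils.largest is hypothetical; its behaviour for empty lists is an assumption.
public class SequenceGuidelineTest {

    @Test
    public void singleElementSequence() {
        assertEquals(4, (int) SequenceUtils.largest(Collections.singletonList(4)));
    }

    @Test
    public void sequencesOfDifferentSizesWithFirstMiddleAndLastElementsAccessed() {
        List<Integer> shortSeq = Arrays.asList(9, 1, 2);          // largest is the first element
        List<Integer> mediumSeq = Arrays.asList(1, 9, 2);         // largest is a middle element
        List<Integer> longSeq = Arrays.asList(1, 2, 3, 4, 5, 9);  // largest is the last element
        assertEquals(9, (int) SequenceUtils.largest(shortSeq));
        assertEquals(9, (int) SequenceUtils.largest(mediumSeq));
        assertEquals(9, (int) SequenceUtils.largest(longSeq));
    }

    @Test(expected = IllegalArgumentException.class)
    public void zeroLengthSequence() {
        SequenceUtils.largest(Collections.<Integer>emptyList());
    }
}
```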
General Testing Guidelines
Choose inputs that force the system to generate all error messages
Design inputs that cause input buffers to overflow
Repeat the same input or series of inputs numerous times
Force invalid outputs to be generated
Force computation results to be too large or too small.
Component Testing
Software components are often composite components that are made up of
several interacting objects.
For example, in the weather station system, the reconfiguration component includes
objects that deal with each aspect of the reconfiguration.
You access the functionality of these objects through the defined component
interface.
Testing composite components should therefore focus on showing that the
component interface behaves according to its specification.
You can assume that unit tests on the individual objects within the component have been
completed.
Interface Testing
Interface Testing
Objectives are to detect faults due to interface errors or invalid assumptions about
interfaces.
Interface types
Parameter interfaces Data passed from one method or procedure to another.
Shared memory interfaces Block of memory is shared between procedures or functions.
Procedural interfaces Sub-system encapsulates a set of procedures to be called by other
sub-systems.
Message passing interfaces Sub-systems request services from other sub-systems
Interface Errors
Interface misuse
A calling component calls another component and makes an error in its use
of its interface e.g. parameters in the wrong order.
Interface misunderstanding
A calling component embeds assumptions about the behaviour of the
called component which are incorrect.
Timing errors
The called and the calling component operate at different speeds and out-of-date information is accessed.
Interface Testing Guidelines
Design tests so that parameters to a called procedure are at
the extreme ends of their ranges.
Always test pointer parameters with null pointers.
Design tests which cause the component to fail.
Use stress testing in message passing systems.
In shared memory systems, vary the order in which components
are activated.
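A sketch of the first two guidelines applied to a hypothetical procedural interface ReportFormatter.formatReadings(List<Double> readings, int width), assumed to return a fixed-width string; the 1–80 column range is also an assumption for illustration.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import java.util.Arrays;
import java.util.List;

// ReportFormatter and its formatReadings interface are hypothetical.
public class ReportFormatterInterfaceTest {

    @Test
    public void parametersAtTheExtremeEndsOfTheirRanges() {
        List<Double> readings = Arrays.asList(-273.15, 57.7);
        assertEquals(1,  ReportFormatter.formatReadings(readings, 1).length());   // narrowest assumed width
        assertEquals(80, ReportFormatter.formatReadings(readings, 80).length());  // widest assumed width
    }

    @Test(expected = NullPointerException.class)
    public void nullPointerParameterIsAlwaysTested() {
        // A null where the interface expects a collection should fail fast,
        // not silently corrupt the component's state.
        ReportFormatter.formatReadings(null, 10);
    }
}
```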
System Testing
System testing during development involves integrating components to create a
version of the system and then testing the integrated system.
The focus in system testing is testing the interactions between components.
System testing checks that components are compatible, interact correctly and
transfer the right data at the right time across their interfaces.
System testing tests the emergent behavior of a system.
System and Component Testing
During system testing, reusable components that have been separately developed
and off-the-shelf systems may be integrated with newly developed components.
The complete system is then tested.
Components developed by different team members or sub-teams may be
integrated at this stage. System testing is a collective rather than an individual
process.
In some companies, system testing may involve a separate testing team with no
involvement from designers and programmers.
Use-case Testing
The use-cases developed to identify system interactions can be used as a
basis for system testing.
Each use case usually involves several system components so testing the
use case forces these interactions to occur.
The sequence diagrams associated with the use case document the components and interactions that are being tested.
Collect Weather Data Sequence Chart
Testing Policies
Exhaustive system testing is impossible so testing policies which define the
required system test coverage may be developed.
Examples of testing policies:
All system functions that are accessed through menus should be tested.
Combinations of functions (e.g. text formatting) that are accessed through the
same menu must be tested.
Where user input is provided, all functions must be tested with both correct and
incorrect input.
Regression Testing
Regression testing is testing the system to check that changes have not ‘broken’
previously working code.
In a manual testing process, regression testing is expensive but, with automated
testing, it is simple and straightforward. All tests are rerun every time a change is
made to the program.
Tests must run ‘successfully’ before the change is committed.
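One way to make "rerun everything on every change" convenient in JUnit 4 is to group the tests into a suite, so the complete set is executed in one run; the listed classes are the hypothetical test classes sketched earlier in this chapter.

```java
import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// Grouping the unit tests into one suite means the complete set can be rerun
// after every change; the change is only committed if the whole suite passes.
@RunWith(Suite.class)
@Suite.SuiteClasses({
    WeatherStationStateTest.class,
    MathUtilsTest.class,
    DateParserTest.class,
    InputValidatorPartitionTest.class
})
public class RegressionSuite {
    // No body needed -- the annotations tell JUnit which test classes to run.
}
```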
Release Testing
Release testing is the process of testing a particular release of a system that is intended for
use outside of the development team.
The primary goal of the release testing process is to convince the supplier of the system
that it is good enough for use.
Release testing, therefore, has to show that the system delivers its specified functionality,
performance and dependability, and that it does not fail during normal use.
Release testing is usually a black-box testing process where tests are only derived from the
system specification.
Release Testing And System Testing
Release testing is a form of system testing.
Important differences:
A separate team that has not been involved in the system development should be responsible for release testing.
System testing by the development team should focus on discovering bugs in
the system (defect testing). The objective of release testing is to check that the
system meets its requirements and is good enough for external use (validation
testing).
Requirements Based Testing
Requirements-based testing involves examining each requirement and
developing a test or tests for it.
MHC-PMS requirements:
If a patient is known to be allergic to any particular medication, then
prescription of that medication shall result in a warning message being issued to
the system user.
If a prescriber chooses to ignore an allergy warning, they shall provide a reason
why this has been ignored.
Requirements Tests
Set up a patient record with no known allergies. Prescribe medication for
allergies that are known to exist. Check that a warning message is not issued
by the system.
Set up a patient record with a known allergy. Prescribe the medication that the patient is allergic to, and check that the warning is issued by the system.
Set up a patient record in which allergies to two or more drugs are recorded.
Prescribe both of these drugs separately and check that the correct warning
for each drug is issued.
Prescribe two drugs that the patient is allergic to. Check that two warnings are
correctly issued.
Prescribe a drug that issues a warning and overrule that warning. Check that
the system requires the user to provide information explaining why the warning
was overruled.
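The first two of these tests might be automated roughly as below. Patient, PrescriptionService and PrescriptionResult are hypothetical classes invented purely to illustrate the shape of requirements-based tests; the MHC-PMS itself is not specified at this level of detail.

```java
import org.junit.Test;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

// Patient, PrescriptionService and PrescriptionResult are hypothetical.
public class AllergyWarningRequirementTest {

    @Test
    public void noWarningWhenPatientHasNoKnownAllergies() {
        Patient patient = new Patient("P-001");                 // no allergies recorded
        PrescriptionService service = new PrescriptionService();

        PrescriptionResult result = service.prescribe(patient, "penicillin");

        assertFalse(result.warningIssued());
    }

    @Test
    public void warningIssuedWhenPatientIsAllergicToPrescribedDrug() {
        Patient patient = new Patient("P-002");
        patient.addAllergy("penicillin");
        PrescriptionService service = new PrescriptionService();

        PrescriptionResult result = service.prescribe(patient, "penicillin");

        assertTrue(result.warningIssued());
    }
}
```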
Features Tested By Scenario
Authentication by logging on to the system.
Downloading and uploading of specified patient records to a laptop.
Home visit scheduling.
Encryption and decryption of patient records on a mobile device.
Record retrieval and modification.
Links with the drugs database that maintains side-effect information.
The system for call prompting.
A Usage Scenario For The MHC-PMS
Kate is a nurse who specializes in mental health care. One of her responsibilities is to visit patients at home to
check that their treatment is effective and that they are not suffering from medication side-effects.
On a day for home visits, Kate logs into the MHC-PMS and uses it to print her schedule of home visits for that
day, along with summary information about the patients to be visited. She requests that the records for these
patients be downloaded to her laptop. She is prompted for her key phrase to encrypt the records on the laptop.
One of the patients that she visits is Jim, who is being treated with medication for depression. Jim feels that
the medication is helping him but believes that it has the side-effect of keeping him awake at night. Kate looks
up Jim’s record and is prompted for her key phrase to decrypt the record. She checks the drug prescribed and
queries its side effects. Sleeplessness is a known side effect so she notes the problem in Jim’s record and
suggests that he visits the clinic to have his medication changed. He agrees so Kate enters a prompt to call
him when she gets back to the clinic to make an appointment with a physician. She ends the consultation and
the system re-encrypts Jim’s record.
After finishing her consultations, Kate returns to the clinic and uploads the records of patients visited to the
database. The system generates a call list for Kate of those patients who she has to contact for follow-up
information and make clinic appointments.
Performance Testing
Part of release testing may involve testing the emergent properties of a
system, such as performance and reliability.
Tests should reflect the profile of use of the system.
Performance tests usually involve planning a series of tests where the load
is steadily increased until the system performance becomes
unacceptable.
Stress testing is a form of performance testing where the system is
deliberately overloaded to test its failure behavior.
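The sketch below illustrates only the stepwise-increase idea behind a performance test: a hypothetical TargetService.handleRequest() operation is called by a steadily growing number of concurrent clients and the time per batch is recorded, so the point at which performance becomes unacceptable can be observed. In practice a dedicated load-testing tool would be used rather than hand-written code like this.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// TargetService.handleRequest() is hypothetical and assumed to return void.
public class LoadRamp {

    public static void main(String[] args) throws InterruptedException {
        Runnable request = () -> TargetService.handleRequest();

        for (int clients = 10; clients <= 200; clients += 10) {   // steadily increase the load
            long start = System.nanoTime();
            ExecutorService pool = Executors.newFixedThreadPool(clients);
            for (int i = 0; i < clients; i++) {
                pool.submit(request);
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            // Watch for the load level at which the time per batch becomes unacceptable.
            System.out.printf("%d concurrent clients: %d ms%n", clients, elapsedMs);
        }
    }
}
```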
User Testing
User or customer testing is a stage in the testing process in which users or customers
provide input and advice on system testing.
User testing is essential, even when comprehensive system and release testing have
been carried out.
The reason for this is that influences from the user’s working environment have a major effect
on the reliability, performance, usability and robustness of a system. These cannot be
replicated in a testing environment.
Types of User Testing
Alpha testing
Users of the software work with the development team to test the software at the
developer’s site.
Beta testing
A release of the software is made available to users to allow them to experiment and to raise
problems that they discover with the system developers.
Acceptance testing
Customers test a system to decide whether or not it is ready to be accepted from the system
developers and deployed in the customer environment. Primarily for custom systems.
The Acceptance Testing Process
Stages In The Acceptance Testing Process
Define acceptance criteria
Plan acceptance testing
Derive acceptance tests
Run acceptance tests
Negotiate test results
Reject/accept system
Key Points
When testing software, you should try to ‘break’ the software by using
experience and guidelines to choose types of test case that have been
effective in discovering defects in other systems.
Wherever possible, you should write automated tests. The tests are embedded
in a program that can be run every time a change is made to a system.
Test-first development is an approach to development where tests are written
before the code to be tested.
Scenario testing involves inventing a typical usage scenario and using this to
derive test cases.
Acceptance testing is a user testing process where the aim is to decide if the
software is good enough to be deployed and used in its operational
environment.