
Test Plan V3.2
For The Knowledge Box

Prepared By: S. Reuben
Representing: Team Profound System Solutions
Submitted To: Dr. Joo Tan
Date: April 12, 2007

Table of Contents

Revision History
1.0 Introduction
2.0 Test Items
3.0 Features to Be Tested
4.0 Features Not to Be Tested
5.0 Approach
5.1 Feature Testing
5.2 Regression Testing
5.3 Product Installation
5.4 Backup and Restore
5.5 Usability Testing
5.6 Scenario Testing
6.0 Pass/Fail Criteria
7.0 Defect States
8.0 Suspension Criteria and Resumption Requirements
9.0 Test Deliverables
10.0 Testing Tasks
11.0 System Test Configuration
12.0 Responsibilities
13.0 Entrance Criteria
14.0 Staffing and Training Needs
15.0 Schedule
16.0 Risks and Contingencies
17.0 References

Revision History

Version | Primary Author(s)  | Description of Version | Date Completed
1.0     | J. Dimino, D. Mote | Initial draft created for submission and review by Upper Management. | February 1, 2007
1.1     | J. Dimino, D. Mote | Changes made to all sections following review by Upper Management. | February 3, 2007
2.0     | J. Dimino, D. Mote | Changes made to Sections 1, 2, 3, 4, 5, 7, 8, 10, 11, 12, 13, 14 following the Test Plan Review Meeting held on 2/6/07. | February 7, 2007
2.1     | J. Dimino, D. Mote | Changes made to Sections 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, and 14. | February 22, 2007
2.2     | J. Dimino, D. Mote | Changes made to the Table of Contents and Sections 1, 2, 4, and 5. Added Sections 7 and 13. Changes made to Sections 14, 16, and 17. | March 5, 2007
3.0     | S. Reuben          | Initial re-draft and updates for the second iteration of development and testing. | March 26, 2007
3.1     | S. Reuben          | Changes made to Sections 1, 3, 4, 5.1-5.4, 7, 8, 9, 10, 11, 12, 13, 14, 16, 17. | March 28, 2007
3.2     | S. Reuben          | Changes made to Sections 3 and 5.5. Added Section 5.6 Scenario Testing. | April 12, 2007

1.0 Introduction

The Knowledge Box Project was started in September of 2006. The purpose of this document is to outline the test procedures designed to validate the functionality of The Knowledge Box. This document defines the test strategy and the test system, and provides an estimate of the testing effort. The requirements to be tested are specified in the Business Requirements I document [1]. The testing team will test the system as thoroughly as possible within the time allotted.

2.0 Test Items

The following is a high-level list of the product components addressed by this test plan:

Releases to be tested: This test effort will address the feature functionality of The Knowledge Box. The builds tested will be the latest versions of the project provided to us by the development team on April 3, 2007 and April 17, 2007.
Bug Fixes: This is not the first release of this project, and some bugs were found in previous versions; as such, regression testing will be done. All additional bugs found during this testing process will be documented and passed on to the developers.
Distribution Media: The latest release of this project will be available to authorized users on the Yahoo Group for CIS480 [2].

3.0 Features to Be Tested

The following features will be tested to ensure that The Knowledge Box satisfies all functionality specified in Business Requirements I; features requested by the client and approved by the development team will also be included and tested after the intermediate release. (A minimal traceability sketch follows the list.)

RS 3.2.1.1 - Admin login
RS 3.2.1.2 - Tech login
RS 3.2.2.1 - Tech logout
RS 3.2.2.2 - Admin logout
RS 3.2.3 - Admin add an account
RS 3.2.4.1 - Tech view all active calls
RS 3.2.4.2 - Admin view all active calls
RS 3.2.5.1 - Tech view all inactive calls
RS 3.2.5.2 - Admin view all inactive calls
RS 3.2.6.1 - Tech view an individual call
RS 3.2.6.2 - Admin view an individual call
RS 3.2.7 - Admin back-up database
RS 3.2.8 - Active and inactive calls sorted
RS 3.2.9 - Admin add a new call
RS 3.2.10 - Admin edit an existing call
RS 3.2.11.1 - Tech open a call
RS 3.2.11.2 - Admin open a call
RS 3.2.12.1 - Tech close a call
RS 3.2.12.2 - Admin close a call
RS 3.2.13 - Admin approve a call
RS 3.2.14 - Help documents accessible
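Since a Traceability Matrix is one of the test deliverables (see Section 9.0), the mapping from the requirement IDs above to test cases can be kept in a simple machine-readable form. The sketch below is illustrative only; the TC-xx test case IDs are hypothetical placeholders, and the actual mapping is maintained in the Test Specification [5].

```python
# Illustrative traceability sketch: requirement IDs from Section 3.0 mapped
# to hypothetical test case IDs (TC-xx are placeholders, not real IDs).
TRACEABILITY = {
    "RS 3.2.1.1": ["TC-01"],            # Admin login
    "RS 3.2.1.2": ["TC-02"],            # Tech login
    "RS 3.2.9":   ["TC-14", "TC-15"],   # Admin add a new call (positive/negative)
    "RS 3.2.14":  ["TC-30"],            # Help documents accessible
}

def untested_requirements(requirements, traceability):
    """Return requirement IDs that have no test case mapped to them."""
    return [rs for rs in requirements if not traceability.get(rs)]

if __name__ == "__main__":
    all_requirements = ["RS 3.2.1.1", "RS 3.2.1.2", "RS 3.2.9", "RS 3.2.14", "RS 3.2.8"]
    print(untested_requirements(all_requirements, TRACEABILITY))  # ['RS 3.2.8']
```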

4.0 Features Not to Be Tested

The following features will not be tested in the current release of the system:

GUI Performance
Security Performance
Load/Stress Performance

5.0 Approach

The overall testing approach includes feature testing, regression testing, product installation
testing, backup/restore testing, usability testing, and scenario testing. Each type of testing is described
in more detail in the following sub-sections.

5.1 Feature Testing

All features described in the requirements definition (Business Requirements I), along with features added per client request, will be tested on the configuration described in Section 11. Feature testing will include functional testing and both positive and negative testing [3].
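As an illustration of positive and negative testing, the sketch below shows how the login requirements (RS 3.2.1.1 and RS 3.2.1.2) might be exercised if the login check were exposed to an automated harness. The login function and its credentials are hypothetical; in practice these tests are executed manually against the delivered executable per the Test Specification [5].

```python
import unittest

# Hypothetical stand-in for the application's login check; the real
# Knowledge Box is a Visual Basic 2005 executable that is tested manually.
def login(username, password, role):
    valid = {("admin", "admin123", "Admin"), ("tech1", "tech123", "Tech")}
    return (username, password, role) in valid

class LoginFeatureTests(unittest.TestCase):
    def test_admin_login_positive(self):
        # Positive test: valid admin credentials should succeed (RS 3.2.1.1).
        self.assertTrue(login("admin", "admin123", "Admin"))

    def test_admin_login_negative_wrong_password(self):
        # Negative test: a wrong password must be rejected.
        self.assertFalse(login("admin", "wrong", "Admin"))

    def test_tech_cannot_login_as_admin(self):
        # Negative test: a technician account must not obtain admin access.
        self.assertFalse(login("tech1", "tech123", "Admin"))

if __name__ == "__main__":
    unittest.main()
```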

5.2 Regression Testing

We want to make sure that bugs fixed in this test cycle do not break functionality that worked in an earlier release.
Bugs are to be fixed as they are found. For each software build released to the test team, tests will be run to verify that the bugs fixed in that build do not reoccur.
Once the product is stable and the test cases are proven, there will be one final regression pass before formal testing. For this release, all test cases must be included in order to pass regression. Passing regression will also require that 100% of major bugs, at least 90% of moderate bugs, and at least 25% of minor bugs be fixed.
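The regression exit thresholds above (100% of major, at least 90% of moderate, and at least 25% of minor bugs fixed) can be checked mechanically. The following is a minimal sketch of that check, assuming bug counts are tallied from the Bug Status Report; it is not part of the actual test tooling.

```python
# Minimal sketch of the regression exit check described in Section 5.2.
# Thresholds: 100% of major, >= 90% of moderate, >= 25% of minor bugs fixed.
THRESHOLDS = {"major": 1.00, "moderate": 0.90, "minor": 0.25}

def regression_exit_met(fixed, total):
    """fixed/total are dicts of bug counts keyed by severity."""
    for severity, required in THRESHOLDS.items():
        if total.get(severity, 0) == 0:
            continue  # no bugs of this severity, nothing to check
        if fixed.get(severity, 0) / total[severity] < required:
            return False
    return True

# Example counts (hypothetical): 4/4 major, 9/10 moderate, 2/8 minor fixed.
print(regression_exit_met({"major": 4, "moderate": 9, "minor": 2},
                          {"major": 4, "moderate": 10, "minor": 8}))  # True
```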

5.3 Product Installation

Each software build released to the test team will be installed in accordance with the procedure
specified in the Development Plan, Section 3.5 System Delivery Procedure [4].

5.4 Backup and Restore

The backup functionality for The Knowledge Box has been implemented and will be tested
accordingly.

5.5 Usability Testing


The system will be tested for usability by verifying that the interface is easy to navigate and that all
items are properly labeled. The following areas will be evaluated and tested:
RS 3.2.15 - Help documentation accurate
RS 4.0.3.1 - Consistency guidelines
RS 4.0.3.2 - Visibility guidelines
RS 4.0.3.3 - Affordance guidelines
RS 4.0.3.4 - Closure guidelines

5.6 Scenario Testing


The system will also be tested using various true-to-life scenarios to verify workflow and the coordination of functions (a sketch of the first scenario follows the list):
RS 3.2.16 - Complete and approve call scenario
RS 3.2.17 - Delete active user scenario
RS 3.2.18 - Add new admin scenario
RS 3.2.19 - Admin sort scenario
RS 3.2.20 - Add numerous new calls scenario
RS 3.2.21 - Add new call with an error scenario
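As an illustration of how a scenario exercises several functions in sequence, the sketch below walks through the "complete and approve call" scenario (RS 3.2.16) against a hypothetical scripted interface. The KnowledgeBox class and its methods are placeholders; the real scenarios are performed manually through the application's GUI.

```python
# Illustrative sketch of RS 3.2.16 (complete and approve call scenario).
# The KnowledgeBox class below is a hypothetical placeholder, not the
# actual application interface.
class KnowledgeBox:
    def __init__(self):
        self.calls = {}

    def add_call(self, call_id, description):
        self.calls[call_id] = {"description": description, "status": "open"}

    def close_call(self, call_id):
        self.calls[call_id]["status"] = "closed"

    def approve_call(self, call_id):
        # Approval is only meaningful once the call has been closed.
        assert self.calls[call_id]["status"] == "closed"
        self.calls[call_id]["status"] = "approved"

def run_complete_and_approve_scenario():
    box = KnowledgeBox()
    box.add_call(101, "Printer offline in Lytle Hall lab")  # Admin adds a call (RS 3.2.9)
    box.close_call(101)                                     # Tech closes the call (RS 3.2.12.1)
    box.approve_call(101)                                   # Admin approves the call (RS 3.2.13)
    assert box.calls[101]["status"] == "approved"
    print("Scenario RS 3.2.16 passed")

run_complete_and_approve_scenario()
```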

6.0 Pass/Fail Criteria

The pass/fail criterion for each test case is described by its expected result. If the expected result is obtained when a test case is run, the test passes; if not, the test fails. If the expected result is obtained but causes other problems later, an MR will be filed and dealt with by the development team. In order for the Knowledge Box product to successfully exit the system test phase, 100% of the test cases defined in the Test Specification must be run and must pass [5]. (A minimal sketch of this check follows the severity definitions below.)

The following three severity classifications will be used for all bugs found in the product:

Major - the product is unusable or major functionality does not work (for example, cannot add a call, cannot log on)
Moderate - bugs that can be worked around and that do not affect major functionality
Minor - enhancements and nuisance bugs

All bug classification is subject to review and interpretation by the MR Board.
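The pass/fail rule above reduces to a comparison of expected and actual results, plus the requirement that 100% of the test cases in the Test Specification pass before system test exit. The following is a minimal sketch of that check; the test case data shown is hypothetical.

```python
# Minimal sketch of the pass/fail rule in Section 6.0: a test case passes
# only if its actual result matches its expected result, and system test
# exit requires 100% of test cases to pass. Data shown is hypothetical.
def test_passes(expected, actual):
    return expected == actual

def system_test_exit(results):
    """results: list of (test_id, expected, actual) tuples."""
    failures = [tid for tid, exp, act in results if not test_passes(exp, act)]
    return len(failures) == 0, failures

example_results = [
    ("TC-01", "Admin reaches main menu", "Admin reaches main menu"),
    ("TC-02", "Invalid login rejected", "Invalid login rejected"),
]
ok, failures = system_test_exit(example_results)
print(ok, failures)  # True []
```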

7.0 Defect States

To track the status of all bugs, six defect states are defined as follows (a minimal code sketch of these states follows the list):

New - a defect that has not been acknowledged or addressed by the developers
Defer - a defect that will be postponed and addressed at a later date
Open - a defect that is currently being worked on by the Development Team
Fixed - a defect that has been fixed by the Development Team and is ready for re-testing
Closed - the Test Team has verified the fix using the same test configuration and test procedure that originally found the defect
Trash - a defect that is not valid or that will not be addressed
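The six states above form a simple lifecycle that any bug tracker can represent directly. The enumeration below is a minimal sketch of those states, with an assumed set of allowed transitions; the plan defines only the states themselves, so the transition map is an assumption for illustration.

```python
from enum import Enum

class DefectState(Enum):
    NEW = "New"        # not yet acknowledged or addressed by the developers
    DEFER = "Defer"    # postponed, to be addressed at a later date
    OPEN = "Open"      # currently being worked on by the Development Team
    FIXED = "Fixed"    # fixed and ready for re-testing by the Test Team
    CLOSED = "Closed"  # fix verified with the original configuration and procedure
    TRASH = "Trash"    # not valid, or will not be addressed

# Assumed transition map for illustration only; Section 7.0 defines the
# states but not the transitions between them.
ALLOWED_TRANSITIONS = {
    DefectState.NEW: {DefectState.OPEN, DefectState.DEFER, DefectState.TRASH},
    DefectState.DEFER: {DefectState.OPEN, DefectState.TRASH},
    DefectState.OPEN: {DefectState.FIXED, DefectState.DEFER, DefectState.TRASH},
    DefectState.FIXED: {DefectState.CLOSED, DefectState.OPEN},
    DefectState.CLOSED: set(),
    DefectState.TRASH: set(),
}

def can_transition(current, target):
    return target in ALLOWED_TRANSITIONS[current]

print(can_transition(DefectState.FIXED, DefectState.CLOSED))  # True
```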

8.0 Suspension Criteria and Resumption Requirements

If fundamental functionality, such as the ability to install and run the program, does not work,
testing will be suspended until the functionality is available. Attempts will be made to continue in the
face of catastrophic bugs unless the bugs are so severe that 50% or more of the test cases are blocked.
If testing is suspended due to improperly functioning code, the development team will have 48 hours to
get the system operational again.

9.0 Test Deliverables

The following items are work products that will be output of the testing process:
Traceability Matrix
Test Plan
Test Design
Test Specification
Modification Requests Report
Test Status Report
Test Metrics Report
Test Log Reports
Bug Status Report
Please refer to the WBS for deliverable dates [6].

10.0 Testing Tasks


The tasks to be performed during the test cycles for the Knowledge Box application are listed in Table 10.1, which states the estimated number of days for each task.

Table 10.1: Estimated effort for the 1st and 2nd Test Cycles

Task                                  | Time (days per week of testing)
Perform Product Installation Testing  | 1
Test Admin/Tech Coordination Features | 2
Test Admin User Features              | 2
Test Tech User Features               | 2
Test Previously Reported Bugs         | 2
Review/Test Help Documentation        | 2
Write MR Report                       | 2
Write Test Status Report              | 2
Write Test Metrics Report             | 2
Write Test Logs                       | 2
Write Bug Status Report               | 2

Please refer to the Gantt Chart for specific report dates [7].

11.0 System Test Configuration


For the test process, we will use the KU Computer Lab in Lytle Hall. The Knowledge Box is
designed to run on Windows XP. It was created using Visual Basic 2005. The test product will be in
the form of an executable (.exe) program which the testers will download from the Yahoo group. The
use of a single shared drive to which all users connect is being coordinated.


12.0 Responsibilities
Both the Development and Test Teams have responsibilities for ensuring the quality of the intended product.

Development Team responsibilities include:
Unit test features as they are developed
Fix 100% of major bugs and at least 90% of moderate bugs; minor bugs will be fixed if time allows
Perform integration testing on features before they are packaged in a build for the test team
Prepare the application for delivery to the test team according to the WBS
Fix bugs submitted via MRs

Test Team responsibilities include:
Run planned tests and create MR reports for developers
Prepare an MR Summary Report on bugs found
Prepare a Test Status Report and Test Metrics Report at the end of the testing phase
Prepare Test Logs and a Bug Status Report


13.0 Entrance Criteria


Before the Test Team will accept the system from the Development Team, the following entrance criteria must be met:
Development unit and integration testing completed
System must install properly without undesirable side effects (see Section 8.0)
100% of major bugs found in development testing must be fixed


14.0 Staffing and Training Needs


No special training is needed by the testers; however, the developers must provide installation instructions to the testers. Testers will test the functionality outlined in the Test Specification and will record bugs using the MR Form [5, 8]. Testers are also required to write a test log at the end of each testing cycle. Table 14.1 details general testing responsibilities and areas.

Table 14.1: Assignments & Responsibilities

Person           | Position  | Responsibility
Johnathan Dimino | Test Lead | Test Admin/Tech Coordination Features
Matt Kinsinger   | Tester    | Test Tech Features
David Mote       | Tester    | Test Backup and Admin Features
Thomas Gudz      | Tester    | Test Admin Features with Reported Bugs
Shneva Reuben    | Tester    | Test Tech Features with Reported Bugs
All Testers      |           | Help Documentation


15.0 Schedule
It is expected that all unit and integration testing done by the development team will occur prior to the
start of Test Cycle #1 and that all Major bugs found in that testing will be fixed prior to the start of
system testing. Table 15.1 provides the start and end dates for the tasks in each cycle.

Table 15.1: Task Start & End Dates

Task                   | Start Date | End Date
Test Cycle #1          | 4/3        | 4/13
Smoke Test             | 4/3        | 4/3
Summarize MR Reports   | 4/13       | 4/13
Test Cycle #2          | 4/17       | 4/20
Smoke Test             | 4/17       | 4/17
Summarize MR Reports   | 4/20       | 4/26
Summarize Test Results | 4/23       | 4/26


16.0 Risks and Contingencies


The following Table 16.1 outlines possible risks and contingency plans affecting the testing of the
Knowledge Box system:
Table 16.1: Risks and Contingencies

Risk | Likelihood | Impact | Mitigation
Hardware/software required are not available at testing time. | 25% | Major | The testing will be done using the Lytle Hall computer room. If this room is not available, other backup labs should be found and reserved.
Requirements change during test development. | 25% | Moderate | Requirements should be reviewed continuously, and test cases must be revised if requirements change.
Test cases are not ready. | 25% | Moderate | Testing of available test cases should be performed while the remaining test cases are finalized.
User interface changes during test development or at the start of testing. | 25% | Moderate | If the interface changes, the development team must inform the test team of the changes.
Inclement weather forces a delay in the schedule. | 25% | Moderate | If people are not available due to weather, others from the team may be asked to help in the testing phase. If the university closes, testers may need to download the system at home to perform testing.
People scheduled to test are not available at test time. | 25% | Minor | If people are not available, another team member should be asked to assist in testing.
Personnel do not give 100% effort to the project. | 25% | Moderate | If members are not giving 100% to the project, this should be brought to the attention of Upper Management and the Test Lead, who will address the issue and take necessary action.
Test cases in the Test Specification do not correspond to the system interface delivered. | 50% | Moderate | Test cases may need to be modified to match the system interface that is delivered.
Test documents are inconsistent. | 25% | Minor | Documents will be reviewed and must be updated according to the schedule outlined in the WBS [6].
Test documents are not completed on time. | 25% | Minor | The documents must be completed within 24 hours and all team members notified.


17.0 References
[1] Business Requirements I v3.2-1, February 6, 2007, updated by Edward Dunkelberger.
[2] http://tech.groups.yahoo.com/group/cis480_SelectTopics/files/, March 28, 2007.
[3] R. Culbertson, C. Brown, and G. Cobb, Rapid Testing, Prentice Hall PTR, 2002.
[4] Development Plan v4.2, March 27, 2007, Mike Herring (Developer).
[5] Test Specification, April 4, 2007, David Mote (Tester).
[6] WBS_iter2_v1.5(3) - Work Breakdown Structure, March 25, 2007, Judy Haas (Project Manager).
[7] Gantt Chart v1.0 iter2, March 26, 2007, Judy Haas (Project Manager).
[8] MR Form, March 22, 2007, Shneva Reuben (Tester).

