QA Process Document
AppLabs Technologies
1700 Market Street, Suite 1406 Philadelphia, PA 19103
Phone: 215.569.9976 Fax: 215.569.9956
Quality Assurance Process Document
TABLE OF CONTENTS
1.0 Approach
    1.1 Overview
    1.2 Definition of Scope
    1.3 Tests to be Conducted
    1.4 Tools Utilized
    1.5 Project Inception Checklist
2.0 Test Plans
    2.1 Test Plan Creation Process
    2.2 Test Plan Structure
    2.3 Sample Test Plan
3.0 Test Plan Execution
    3.1 Manual Execution Process
    3.2 Reporting Details
    3.3 Handling Procedures
    3.4 Severity Levels
4.0 Fix Validation Process
5.0 Reporting
    5.1 Daily Summary Data
    5.2 Weekly Summary Report
    5.3 End of Cycle Reports
6.0 Compatibility Test
    6.1 Browser Checks
    6.2 Operating Systems
7.0 Automated Testing
8.0 Stress, Load and Capacity Testing
    8.1 Load Testing
    8.2 Stress Testing
    8.3 Capacity Testing
    8.4 Reporting Terms
9.0 Minimal Acceptance Tests
10.0 Communication
1.0 Approach
1.1 Overview
AppLabs Technologies provides offshore and on-site quality assurance services. We have a dedicated staff of QA professionals who perform test plan creation, script development for automated testing, manual and automated test plan execution, and stress, load, and capacity testing.
The following pages outline our overall approach to QA. While each project varies, we recommend and utilize the procedures outlined in this document.
1.2 Definition of Scope
Areas to be tested
• New System Modules
• Modified System Modules
• Unchanged System Modules
Environment Requirements
• Connection through: [VPN, Firewall Access, DNS]
• DB Access Required & Provided [Y/N]
• Platforms Needed [Win 95, Win 98, Win NT, Win ME, Win 2000, Solaris, Mac 8.6, Mac 9, Mac 10]
• Browser Types Needed [IE 4.0, IE 4.1, IE 5, IE 5.5, NS 4.08, NS 4.75, NS 6.0]
1.3 Tests to be Conducted
Functional Tests
A set of functional tests will be developed based on client documentation and review of any existing systems. Dedicated QA Engineers will execute these functional tests manually.
Compatibility Tests
AppLabs' QA lab provides for testing to be conducted on multiple browsers and operating systems. The desired testing combinations and process (including priority of testing) will be determined during the scope definition process.
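For illustration, the candidate combination matrix can be enumerated from the platform and browser lists above; a minimal sketch follows (Python; which combinations are actually run, and in what priority, is decided with the client during scope definition, so the ranking is deliberately left out):

    # Sketch: enumerate candidate browser/OS combinations for compatibility testing.
    # The platform and browser lists come from the scope section above; the actual
    # test matrix and its priority are agreed with the client during scoping.
    from itertools import product

    platforms = ["Win 95", "Win 98", "Win NT", "Win ME", "Win 2000",
                 "Solaris", "Mac 8.6", "Mac 9", "Mac 10"]
    browsers = ["IE 4.0", "IE 4.1", "IE 5", "IE 5.5", "NS 4.08", "NS 4.75", "NS 6.0"]

    combinations = list(product(platforms, browsers))
    print(f"{len(combinations)} candidate combinations")
    for platform, browser in combinations[:5]:  # show the first few
        print(f"{platform} + {browser}")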
Automated Testing
AppLabs has the capacity to devote dedicated resources to developing automated test scripts. The areas of the application where automated testing is applied will be identified by AppLabs and the client during the scoping phase. Typically, areas that remain relatively constant and are considered reasonably stable are strong candidates for automated testing.
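For illustration only, a minimal automated check of this kind might look like the following sketch (written here in Python with the standard unittest and urllib modules; the URL and the expected page text are hypothetical placeholders, not values from an actual client engagement):

    # Minimal automated smoke-test sketch. The URL and expected text are
    # hypothetical placeholders, not values from a real project.
    import unittest
    import urllib.request


    class LoginPageSmokeTest(unittest.TestCase):
        BASE_URL = "http://example.com/login"  # hypothetical application URL

        def test_login_page_loads(self):
            # The login page should return HTTP 200 and contain its form field.
            with urllib.request.urlopen(self.BASE_URL) as response:
                body = response.read().decode("utf-8", errors="replace")
                self.assertEqual(response.status, 200)
                self.assertIn("username", body.lower())


    if __name__ == "__main__":
        unittest.main()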
1.4 Tools Utilized
For each QA project, AppLabs will provide access to our recommended bug tracking
application, Bugzilla. If our client is already utilizing a bug tracking application, AppLabs will
instead enter bugs directly into that system. AppLabs has considerable experience with the
following bug tracking applications:
• Teamshare’s Teamtrack
• Soffront’s TrackWeb
• Softwise’s PR-Tracker
• GNATS Bug Tracking
1.5 Project Inception Checklist
Client:
Project Name:
Main Area        Item                                         Onus       Date    Info
Bug Tracking     Web Based System Available to All Parties    AppLabs
                 Client Users Created                         AppLabs
2.0 Test Plans
2.1 Test Plan Creation Process
Client Documentation
The first step in developing test cases is receiving and reviewing the Functional Requirements Documents (FRDs) provided by the client. The FRD review process proceeds as follows:
[Test plan creation flow: Receipt of Client Documentation → Detailed Test Cases Prepared & Sent to Client → Client Approves? If the client does not approve, the test cases are revised and resubmitted; the cycle repeats until the client approves.]
2.2 Test Plan Structure
Front Page
Each test plan contains a front page, which reports the following for each section and for the entire test plan: # Test Cases, # Cases Completed, # Passed, # Failed, # Blocked, % Completed, % Passed, % Failed, % Blocked.
Column Headings
Each detailed test scenario (test case) has the following column headings: Test Case ID, Status, Defect #, Test Case Area, Test Steps, Expected Results.
Test Case ID    Status    Defect #    Test Area Reference ID    Test Steps    Expected Result
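As an illustrative sketch of how the front-page figures roll up from these columns, the following Python fragment tallies a few hypothetical test case rows (the IDs, areas, statuses, and the convention of computing percentages against the total are all assumptions made for the example):

    # Hypothetical test case rows using the column headings above. Statuses are
    # assumed here to be P (passed), F (failed), B (blocked), or blank (not run).
    from collections import Counter

    test_cases = [
        {"id": "REG-001", "status": "P", "defect": "",    "area": "Registration"},
        {"id": "REG-002", "status": "F", "defect": "142", "area": "Registration"},
        {"id": "LOG-001", "status": "B", "defect": "137", "area": "Login"},
        {"id": "LOG-002", "status": "",  "defect": "",    "area": "Login"},
    ]

    counts = Counter(tc["status"] for tc in test_cases)
    total = len(test_cases)
    completed = counts["P"] + counts["F"] + counts["B"]

    # Front-page style summary: raw counts and percentages of the total
    # (the plan's own convention for the percentage base may differ).
    print("# Test Cases:", total)
    print("# Cases Completed:", completed)
    for label, key in (("Passed", "P"), ("Failed", "F"), ("Blocked", "B")):
        print(f"# {label}: {counts[key]}  ({100.0 * counts[key] / total:.0f}%)")
    print(f"% Completed: {100.0 * completed / total:.0f}%")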
Test Areas
All AppLabs' test plans are divided into sections. Sections consist of groupings of test scenarios. Examples of individual test plan sections include Registration, Login, and User Rights & Privileges.
3.0 Test Plan Execution
3.1 Manual Execution Process
[Test plan execution flow: Test Case Executed → Fail → "F" entered in Status Column of Test Plan → reason for failure determined. If information is needed, the tester prepares a question along with the test case number and the question is sent to the client in the daily status report. If the case is blocked by a bug, the blocking bug number is entered in the Defect Column of the test plan and a blocked report is sent in the daily status report. Otherwise, the bug is entered into the BTS and the bug ID number from the BTS is entered into the test plan against the failed test case. The executed test plan section summaries and the main test plan summary are reviewed for accuracy, section summaries are combined, the summary data is sent in the daily status report, and the test plans are sent as an attachment with the daily status report.]
3.2 Reporting Details
o Browser(s)
o Type (bug, enhancement)
o Severity (as reported by QA engineer)
o Detailed Description – Elements (with date/personnel stamp)
  - Login: provide user name, password, access rights, group, etc.
  - Landing Page: define the page you are taken to.
  - Navigation: define the link/selection area and the link/selection chosen.
  - Navigation: define the landing page and describe the page (continue steps 3 & 4 as required).
  - Issue: identify the outcome of the sequence.
  - Issue Clarification: identify the expected results.
  - Issue Exposure: identify how this impacts every area - other screens, database, etc.
  - Issue Wrap-up: specific details eliminating variables or providing developers insight into the problem.
  - Testing Wrap-up: when this was last tested, when it last worked, how it last worked, etc.
  - Final Emphasis: specific environments, specific user groups, anything previously stated which needs more emphasis.
  - Screen shots where/when applicable.
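A detailed description built from these elements could be assembled as in the sketch below (Python; every field value shown is invented for illustration and does not come from a real project):

    # Sketch of assembling a detailed description from the elements above.
    # All values are hypothetical examples, not data from an actual project.
    report = {
        "Login": "user qa_tester01, standard password, 'Nurse' access rights",
        "Landing Page": "Patient Dashboard",
        "Navigation": "selected 'Create New Patient' from the left menu",
        "Issue": "clicking Save returns a blank page",
        "Issue Clarification": "expected a confirmation page with the new patient ID",
        "Issue Exposure": "the patient record is not written to the database",
        "Issue Wrap-up": "occurs only when the middle-name field is left empty",
        "Testing Wrap-up": "last worked in the previous build; retested this build",
        "Final Emphasis": "IE 5.5 on Win 2000 only; NS 4.75 unaffected",
    }

    description = "\n".join(f"{field}: {value}" for field, value in report.items())
    print(description)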
3.3 Handling Procedures
The flow below is used when a problem issue is found while executing the test plans. It is not designed for reviewing fixed bugs; that flow is outlined in the Fix Validation section of this document.
[Issue handling flow: when an issue is found, the tester checks whether a bug ID is already entered in the Defect Column of the test plan.
If no matching bug exists, the tester checks whether the issue duplicates an existing bug; if it does not, the bug is added to the BTS, "F" is entered in the Status Column, the bug ID is entered in the Defect Column, and the bug is identified in the daily status report.
If a matching bug exists, its current state determines the handling:
- Fixed: the bug is re-opened, "F" is entered in the Status Column, the date, build # and comments are entered in the Resolution field, and a re-opened list is sent to the client in the daily status report.
- Closed: "F" is entered in the Status Column and the bug is either re-opened or entered as a new bug. This will be based on discussion with the client and will depend on whether the bug was closed during this testing cycle and on the process used to close that specific bug.
- Rejected as "not a bug": the tester reviews the rejection. If the tester agrees, the issue remains closed and the test plan is updated to reflect the issue as "not a bug" (if applicable). If the tester does not agree, the bug is annotated and identified in the daily status report; if the client agrees, the bug remains closed, otherwise it is re-opened.
- Cannot Reproduce: the case is re-examined and retested again on different combinations. If the issue still exists, the bug is annotated in the BTS; if not, the issue is closed.]
4.0 Fix Validation Process
[Fix validation flow: the BTS is queried for "Fixed" bugs and an Excel table is generated listing the bugs by severity and application section. The bugs are distributed for review to the appropriate tester, and the tester retests each one. The result determines the next step:
- The issue is re-occurring: the bug status is changed to Open in the BTS; the date, build #, tester, and new results are added in the Resolution column; and the details of the spreadsheet are sent to the client in the daily status report.
- The original bug is no longer occurring, but the expected result is not obtained: a new bug is entered into the BTS; the original bug status is changed to Closed in the BTS with the date, build #, tester, and new bug number added as a reference; and the details of the spreadsheet are sent to the client in the daily status report.
- The issue is no longer occurring and the tester obtains the expected result: the bug status is changed to Closed in the BTS, and the details of the spreadsheet are sent to the client in the daily status report.]
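The query-and-distribute step could be sketched roughly as follows (Python; the bug records, severity ordering, and output file name are hypothetical, and an actual cycle would pull the list from the bug tracking system rather than from an in-memory list):

    # Sketch of listing "Fixed" bugs by severity and application section for
    # retest assignment. Bug data and the output file name are hypothetical.
    import csv

    fixed_bugs = [
        {"id": 101, "severity": "High",   "section": "Login",  "summary": "Login fails on IE 5.5"},
        {"id": 117, "severity": "Low",    "section": "Triage", "summary": "Typo on triage screen"},
        {"id": 109, "severity": "Severe", "section": "Login",  "summary": "Password reset loops"},
    ]

    # Hypothetical severity ranking used only to order the spreadsheet.
    severity_order = {"Severe": 0, "High": 1, "Medium": 2, "Low": 3}
    fixed_bugs.sort(key=lambda bug: (severity_order[bug["severity"]], bug["section"]))

    with open("fixed_bugs_for_retest.csv", "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=["id", "severity", "section", "summary"])
        writer.writeheader()
        writer.writerows(fixed_bugs)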
5.0 Reporting
AppLabs provides daily, weekly and end of cycle reporting throughout the entire QA cycle.
Login / Registration
Outstanding              Week Review
High        5            Closed / Fixed       8
Medium      4            New                  1
Total       9

Create New Patient
Outstanding              Week Review
High        1            Closed / Invalid     2
Change      1            New                  2
Low         1
Block       1
Total       4

Call in Patient
Outstanding              Week Review
High        1            Closed / Fixed       1
Medium      1            Closed / Duplicate   1
Low         4            Closed / Fixed       4
Change      3            New                  3
Total       9

Pre-Triage Patient
Outstanding              Week Review
Medium      1            Closed / Fixed       1
Change      3            New                  3
Total       4

Triage Patient
Outstanding              Week Review
Severe      2            Closed / Fixed       10
High        3            Closed / Invalid     1
Medium      2            New                  19
Low         16
Change      8
Total       31
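Summary counts of this kind can be produced mechanically from the bug list; a rough sketch follows (Python, with invented bug records and area names used purely for illustration):

    # Sketch of tallying outstanding bugs by severity within each application
    # area, as in the summary above. The bug records are invented examples.
    from collections import defaultdict, Counter

    bugs = [
        {"area": "Login / Registration", "severity": "High",   "status": "Open"},
        {"area": "Login / Registration", "severity": "Medium", "status": "Open"},
        {"area": "Triage Patient",       "severity": "Low",    "status": "Open"},
        {"area": "Triage Patient",       "severity": "High",   "status": "Closed / Fixed"},
    ]

    outstanding = defaultdict(Counter)
    for bug in bugs:
        if not bug["status"].startswith("Closed"):
            outstanding[bug["area"]][bug["severity"]] += 1

    for area, severities in outstanding.items():
        print(area)
        for severity, count in severities.items():
            print(f"  {severity:<8} {count}")
        print(f"  Total    {sum(severities.values())}")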
6.2 Operating Systems
1. Linux
2. Solaris 2.6, 5.6, 7.2, 8.0
3. Various Windows versions including: Win 95, Win 98, Win NT, Win ME, Win 2000
4. Multiple Mac versions including: Mac 8.6, Mac 9, Mac 10
8.4 Reporting Terms
Load Size
This is the number of concurrent Virtual Clients trying to access the site.
Throughput
The average number of bytes per second transmitted from the ABT (Application being tested) to the Virtual Clients running this Agenda during the last reporting interval.
Round Time
The average time it took the Virtual Clients to finish one complete iteration of the Agenda during the last reporting interval.
Transaction Time
The time it takes to complete a successful HTTP request, in seconds. (Each request for each gif, jpeg, html file, etc. is a single transaction.) The time of a transaction is the sum of the Connect Time, Send Time, Response Time, and Process Time; a worked example follows these definitions.
Connect Time
The time it takes for a Virtual Client to connect to the ABT (Application being tested).
Send Time
The time it takes the Virtual Clients to write an HTTP request to the ABT (Application being
tested), in seconds.
Response Time
The time it takes the ABT (Application being tested) to send the object of an HTTP request
back to a Virtual Client, in seconds. In other words, the time from the end of the HTTP
request until the Virtual Client has received the complete item it requested.
Process Time
The time it takes WebLoad to parse an HTTP response from the ABT (Application being
tested) and then populate the document-object model (the DOM), in seconds.
Receive Time
This is the elapsed time between receiving the first byte and the last byte.
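As a worked example of the transaction-time relationship defined above (all timing values are hypothetical):

    # Transaction Time is the sum of its component times (values in seconds;
    # the numbers below are hypothetical, not measured results).
    connect_time = 0.04
    send_time = 0.01
    response_time = 0.32
    process_time = 0.05

    transaction_time = connect_time + send_time + response_time + process_time
    print(f"Transaction Time: {transaction_time:.2f} s")  # 0.42 s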
10.0 Communication
Ongoing communication is conducted between our dedicated QA team and our clients. The communication mechanisms employed by our QA unit are listed below.
• Daily Emails
• Weekly Conference Calls
• Weekly Yahoo or other online chat sessions
• Weekly status reports containing the entire week's completed work and projections for the upcoming week
• End of Cycle Reports
• Any additional recommendations/requirements suggested by the client.