
TEST PLAN

UNITECH CONSULTANCY

Table of Contents

1 INTRODUCTION

1.1 Purpose

1.2 Project Overview

1.3 Audience

2 TEST STRATEGY

2.1 Test Objectives

2.2 Test Assumptions

2.3 Test Principles

2.4 Data Approach


2.5 Scope and Levels of Testing
2.5.1 Exploratory

2.5.2 Functional Test


TEST ACCEPTANCE CRITERIA
TEST DELIVERABLES

MILESTONE LIST

2.5.3 User Acceptance Test (UAT)

TEST DELIVERABLES

2.6 Test Effort Estimate


3 EXECUTION STRATEGY
3.1 Entry and Exit Criteria

3.2 Test Cycles

3.3 Validation and Defect Management


3.4 Test Metrics
3.5 Defect Tracking & Reporting

4 TEST MANAGEMENT PROCESS


4.1 Test Management Tool
4.2 Test Design Process
4.3 Test Execution Process
4.4 Test Risks and Mitigation Factors
4.5 Communications Plan and Team Roster
4.6 Role Expectations
4.6.1 Project Management

4.6.2 Test Planning (Test Lead)

4.6.3 Test Team
4.6.4 Test Lead
4.6.5 Development Team

5 TEST ENVIRONMENT

1 INTRODUCTION

1.1 PURPOSE

This system can be used as an online job portal, providing placement services to unemployed candidates who are seeking a job. Job seekers logging into the system should be able to upload their information in the form of a CV. Visitors and company representatives logging in may also access or search any information put up by job seekers.
1.2 PROJECT OVERVIEW

This project is aimed at developing an online search portal of placement details for job seekers. The system is an online application that can be accessed throughout the organization, and outside as well, with a proper login provided. This system can be used as an online job portal for job seekers. Job seekers logging in should be able to upload their information in the form of a CV. Visitors and company representatives logging in may also access or search any information put up by job aspirants.

1.3 AUDIENCE
 Project team members perform the tasks specified in this document, and provide input and recommendations on this document.

 The Project Manager plans for the testing activities in the overall project schedule, reviews the document, tracks the performance of the test according to the tasks herein specified, approves the document and is accountable for the results.

 The stakeholders' representatives and participants (individuals identified by the PMO Leads) may take part in the UAT test to ensure the business is aligned with the results of the test.

 The Technical Team ensures that the test plan and deliverables are in line with the design, provides the environment for testing and follows the procedures related to the fixing of defects.

 Business Analysts will provide their inputs on functional changes.

2. TEST STRATEGY

2.1 TEST OBJECTIVES

The objective of the test is to verify that the functionality of UNITECHCONSULTANCY - MODULE works according to the specifications.

The test will execute and verify the test scripts; identify, fix and retest all high- and medium-severity defects per the entrance criteria; and prioritize lower-severity defects for future fixing via CR.

The final product of the test is twofold:

 Production-ready software;

 A set of stable test scripts that can be reused for Functional and UAT test execution.

2.2. Test Assumptions

 Key Assumptions

Production-like data is required and must be available in the system prior to the start of Functional Testing.

In each testing phase, Cycle 3 will be initiated if the defect rate is high in Cycle 2.

 General

1. Exploratory Testing will be carried out once the build is ready for testing.

2. Performance testing is not considered for this estimation.

3. All defects will come along with a snapshot in JPEG format.

4. The Test Team will be provided with access to the Test environment via VPN connectivity.

5. The Test Team assumes all necessary inputs required during Test design and execution will be supported by Development/Business Analysts appropriately.

Functional Testing

During Functional testing, the testing team will use preloaded data which is available on the system at the time of execution. The Test Team will perform Functional testing only on UNITECHCONSULTANCY - MODULE.

UAT

UAT test execution will be performed by end users (L1, L2 and L3), and the QA Group will provide support on creating the UAT scripts.

2.3. Test Principles

 Testing will be focused on meeting the business objectives, cost efficiency, and quality.
 There will be common, consistent procedures for all teams supporting testing activities.
 Testing processes will be well defined, yet flexible, with the ability to change as needed.
 Testing activities will build upon previous stages to avoid redundancy or duplication of effort.
 The testing environment and data will emulate a production environment as much as possible.
 Testing will be a repeatable, quantifiable, and measurable activity.
 Testing will be divided into distinct phases, each with clearly defined objectives and goals.
 There will be entrance and exit criteria.

2.4. Data Approach

For functional testing, UNITECHCONSULTANCY - MODULE will contain pre-loaded test data, which is used for the testing activities.

2.5. Scope and Levels of Testing

2.5.1. Exploratory

PURPOSE: The purpose of this test is to make sure critical defects are removed before the next levels of testing start.

SCOPE: First-level navigation, dealer and admin modules.

TESTERS: Testing team.

METHOD: Exploratory testing is carried out on the application without any test scripts or documentation.

TIMING: At the beginning of each cycle.

2.5.2. Functional Test

PURPOSE: Functional testing will be performed to check the functions of the application. Functional testing is carried out by feeding inputs and validating the outputs from the application.

SCOPE: The embedded Excel sheet details the scope of the Functional test. Note: the scope is high level due to changes in the requirements. To keep the document easily fragmented and categorized, the scope has been embedded as a separate document; if you prefer, you can insert a table here instead. The scope is created based on the test scenarios that were identified previously.

TESTERS: Testing Team.

METHOD: The test will be performed according to Functional scripts, which are stored in HP ALM.

TIMING: After the Exploratory test is completed.
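As an illustration of what one scripted functional test for this portal might look like, here is a minimal sketch in Python. The `JobPortal` class and its methods are hypothetical stand-ins for the real application, which would be exercised through its UI or API rather than in-process:

```python
# Hypothetical sketch of a scripted functional test for the job portal.
# JobPortal, register_seeker, upload_cv and search_cvs are illustrative
# names only; the real application's interface may differ.

class JobPortal:
    def __init__(self):
        self.cvs = {}  # seeker name -> CV text (None until uploaded)

    def register_seeker(self, name):
        self.cvs.setdefault(name, None)

    def upload_cv(self, name, cv_text):
        if name not in self.cvs:
            raise ValueError("seeker not registered")
        self.cvs[name] = cv_text

    def search_cvs(self, keyword):
        # Visitors/company representatives search CVs by keyword.
        return [n for n, cv in self.cvs.items() if cv and keyword in cv]

def test_seeker_can_upload_and_be_found():
    # Feed inputs (register, upload) and validate the output (search).
    portal = JobPortal()
    portal.register_seeker("asha")
    portal.upload_cv("asha", "Java developer, 5 years")
    assert portal.search_cvs("Java") == ["asha"]

test_seeker_can_upload_and_be_found()
```

In practice each such script would be stored and executed in HP ALM, with a Pass/Fail status recorded per step.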

TEST ACCEPTANCE CRITERIA

1. The approved Functional Specification and Use Case documents must be available prior to the start of the Test design phase.

2. Test cases must be approved and signed off prior to the start of Test execution.

3. Development must be completed and unit tested with pass status, with results shared with the Testing team to avoid duplicate defects.

4. The test environment must be in a ready-to-use state, with the application installed and configured.

TEST DELIVERABLES

S.No. | Deliverable Name           | Author                | Reviewer
1.    | Test Plan                  | Test Lead             | Project Manager / Business Analysts
2.    | Functional Test Cases      | Test Team             | Business Analysts' sign-off
3.    | Logging Defects in HP ALM  | Test Team             | Test Lead / Programming Lead (Shailesh)
4.    | Daily/weekly status report | Test Team / Test Lead | Test Lead / Project Manager
5.    | Test Closure report        | Test Lead             | Project Manager


MILESTONE LIST

The milestone list is tentative and may change for the reasons below:

a) Any issues in the system environment readiness

b) Any change or addition in scope

c) Any other dependency that impacts efforts and timelines

2.5.3. User Acceptance Test (UAT)

PURPOSE: This test focuses on validating the business logic. It allows the end users to complete one final review of the system prior to deployment.

TESTERS: The UAT is performed by the end users (L1, L2 and L3).

METHOD: Since the business users are best positioned to provide input on business needs and how the system meets them, the users may perform some validation not contained in the scripts. The test team writes the UAT test cases based on inputs from the end users (L1, L2 and L3) and the Business Analysts.

TIMING: After all other levels of testing (Exploratory and Functional) are done. Only after this test is completed can the product be released to production.

S.No. | Deliverable Name | Author    | Reviewer
1.    | UAT Test Cases   | Test Team | Business Analysts' sign-off

2.6. Test Effort Estimate

This document lists all the activities that have to be performed by the QA team and estimates how many man-hours each activity will take.

[Embedded spreadsheet: New_Detailed DRFT Test estimate v1.xlsx]

Note: this estimate is for the TCOE team only.

Testing Schedule

3. EXECUTION STRATEGY

3.1. Entry and Exit Criteria

 The entry criteria refer to the desirable conditions in order to start test execution; only the
migration of the code and fixes need to be assessed at the end of each cycle.

 The exit criteria are the desirable conditions that need to be met in order to proceed with the implementation.

 Entry and exit criteria are flexible benchmarks. If they are not met, the test team will assess
the risk, identify mitigation actions and provide a recommendation. All this is input to the
project manager for a final “go-no go” decision.

 Entry criteria to start the execution phase of the test: the activities listed in the Test
Planning section of the schedule are 100% completed.

 Entry criteria to start each cycle: the activities listed in the Test Execution section of the
schedule are 100% completed at each cycle.
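The go/no-go assessment described above, where unmet criteria trigger a risk assessment rather than an automatic stop, can be sketched as a small helper. The criterion names are illustrative only:

```python
# Sketch of a go/no-go helper for the entry criteria above. The plan
# treats criteria as flexible benchmarks feeding the project manager's
# final decision; criterion names here are assumptions for illustration.

def entry_recommendation(criteria):
    """criteria: dict of criterion name -> bool (met / not met)."""
    unmet = [name for name, met in criteria.items() if not met]
    if not unmet:
        return ("go", [])
    # Unmet criteria trigger risk assessment and mitigation actions,
    # not an automatic stop; the PM makes the final go/no-go call.
    return ("assess-risk", unmet)

decision, gaps = entry_recommendation({
    "test planning 100% complete": True,
    "environment ready": False,
})
```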

3.2. Test Cycles

o There will be two cycles for functional testing. Each cycle will execute all the scripts.

o The objective of the first cycle is to identify any blocking or critical defects, and most of the high-severity defects. It is expected that some workarounds will be used in order to get through all the scripts.

o The objective of the second cycle is to identify the remaining high- and medium-severity defects, remove the workarounds from the first cycle, correct gaps in the scripts and obtain performance results.

o The UAT test will consist of one cycle.

3.3. Validation and Defect Management

 It is expected that the testers execute all the scripts in each of the cycles described above. However, it is recognized that the testers may also do additional testing if they identify a possible gap in the scripts. This is especially relevant in the second cycle, when the Business Analysts join the TCOE in the execution of the test, since the Business Analysts have a deeper knowledge of the business processes. If a gap is identified, the scripts and traceability matrix will be updated, and then a defect will be logged against the scripts.

 The defects will be tracked through HP ALM only. The technical team will gather information on a daily basis from HP ALM, request additional details from the Defect Coordinator, and work on fixes.

 It is the responsibility of the tester to open defects, link them to the corresponding script, assign an initial severity and status, retest and close the defect. It is the responsibility of the Defect Manager to review the severity of the defects, facilitate the fix and its implementation with the technical team, communicate to the testers when the test can continue or should be halted, request the tester to retest, and modify the status as the defect progresses through the cycle. It is the responsibility of the technical team to review HP ALM on a daily basis, ask for details if necessary, fix the defect, communicate to the Defect Manager when the fix is done, and implement the solution per the Defect Manager's request.
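The division of responsibilities above implies a defect lifecycle. A minimal sketch of such a lifecycle follows; the status names and transitions are assumptions for illustration, since each HP ALM project configures its own workflow:

```python
# Sketch of the defect lifecycle implied above: the tester opens and
# retests, the Defect Manager reviews and routes, the technical team
# fixes. Status names and transitions are assumptions, not the actual
# HP ALM workflow, which is project-configurable.

TRANSITIONS = {
    "New":      {"Open"},               # tester opens, assigns severity
    "Open":     {"Fixed", "Rejected"},  # technical team works the fix
    "Fixed":    {"Retest"},             # Defect Manager requests retest
    "Retest":   {"Closed", "Reopened"}, # tester verifies the fix
    "Reopened": {"Fixed"},
}

def advance(status, new_status):
    """Move a defect to new_status, rejecting illegal transitions."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

# A happy-path defect: opened, fixed, retested, closed.
s = "New"
for step in ("Open", "Fixed", "Retest", "Closed"):
    s = advance(s, step)
```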

Defect severity will be assigned in HP ALM, and the categories are:

Severity | Impact

1 (Critical): This bug is critical enough to crash the system, cause file corruption, or cause potential data loss. It causes an abnormal return to the operating system (a crash or a system failure message appears). It causes the application to hang and requires rebooting the system.

2 (High): It causes a lack of vital program functionality with a workaround.

3 (Medium): This bug will degrade the quality of the system; however, there is an intelligent workaround for achieving the desired functionality, for example through another screen. This bug prevents other areas of the product from being tested; however, other areas can be independently tested.

4 (Low): There is an insufficient or unclear error message, which has minimum impact on product use.

5 (Cosmetic): There is an insufficient or unclear error message that has no impact on product use.

3.4. Test Metrics

Test metrics to measure the progress and level of success of the test will be developed and shared with the project manager for approval. Below are some of the metrics:

Report | Description | Frequency
Test preparation & execution status | To report on % complete, % WIP, % pass, % fail; defect status by severity (open, closed, any other) | Weekly / Daily (optional)
Daily execution status | To report on pass, fail and total defects; highlight showstopper/critical defects | Daily
Project weekly status report | Project-driven reporting (as requested by the PM) | Weekly, if the project team needs a weekly update apart from the daily one and a template is available
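As a sketch of how the percentages above could be computed from per-test-case statuses (the status values are illustrative; real data would be exported from HP ALM):

```python
# Sketch of the daily execution metrics described above (% pass, % fail,
# % WIP, % complete), computed from per-test-case statuses. Status
# strings are assumptions modeled on typical HP ALM run states.

def execution_metrics(statuses):
    """statuses: list of 'Pass' / 'Fail' / 'WIP' / 'No Run' strings."""
    total = len(statuses)

    def pct(n):
        return round(100.0 * n / total, 1) if total else 0.0

    passed = statuses.count("Pass")
    failed = statuses.count("Fail")
    wip = statuses.count("WIP")
    return {
        "% Pass": pct(passed),
        "% Fail": pct(failed),
        "% WIP": pct(wip),
        "% Complete": pct(passed + failed),  # executed either way
    }

m = execution_metrics(["Pass", "Pass", "Fail", "WIP", "No Run"])
```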

3.5. Defect Tracking & Reporting

The following flowchart depicts the Defect Tracking Process:


4. TEST MANAGEMENT PROCESS

4.1. Test Management Tool

HP Application Lifecycle Management is the tool used for test management. All testing artifacts, such as test cases and test results, are updated in the HP Application Lifecycle Management (ALM) tool.

 Project specific folder structure will be created in HP ALM to manage the status of this
DFRT project.

 Each resource in the Testing team will be provided with Read/Write access to
add/modify Test cases in HP ALM.

 During the Test Design phase, all test cases are written directly into HP ALM. Any
change to the test case will be directly updated in the HP ALM.
 Each Tester will directly access their respective assigned test cases and update the status of
each executed step in HP ALM directly.

 Any defect encountered will be raised in HP ALM linking to the particular Test case/test
step.

 During Defect fix testing, defects are re-assigned back to the tester to verify the defect fix.
The tester verifies the defect fix and updates the status directly in HP ALM.

 Various reports can be generated from HP ALM to provide status of Test execution. For
example, Status report of Test cases executed, Passed, Failed, No. of open defects, Severity
wise defects etc.

4.2. Test Design Process

Understanding Requirements → Establishing Traceability Matrix in HP ALM → Preparation of Test Cases → SME/Peer Review of Test Cases → Incorporating Review Comments in Test Cases
 The tester will understand each requirement and prepare the corresponding test cases to ensure all requirements are covered.

 Each test case will be mapped to use cases and to requirements as part of the traceability matrix.

 Each of the test cases will undergo review by the Business Analyst; the review defects are captured and shared with the Test team. The testers will rework the review defects and finally obtain approval and sign-off.

 During the preparation phase, the tester will use the prototype, use cases and functional specification to write step-by-step test cases.

 Testers will maintain a clarification tracker sheet, which will be shared periodically with the Requirements team, and the test cases will be updated accordingly. The clarifications may sometimes lead to change requests, be ruled out of scope, or detail implicit requirements.

 Sign-off for the test cases will be communicated through mail by the Business Analysts.

 Any subsequent changes to the test cases will be updated directly in HP ALM.
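The traceability check described above, confirming every requirement is covered by at least one test case, can be sketched as follows. The IDs are illustrative; in practice the matrix lives in HP ALM:

```python
# Sketch of a requirement-to-test-case traceability check. REQ-* and
# TC-* identifiers are illustrative assumptions; the real matrix is
# maintained in HP ALM as part of test design.

def uncovered_requirements(requirements, matrix):
    """matrix: dict of test case id -> set of requirement ids it covers."""
    covered = set().union(*matrix.values()) if matrix else set()
    return sorted(set(requirements) - covered)

reqs = ["REQ-1", "REQ-2", "REQ-3"]
matrix = {"TC-01": {"REQ-1"}, "TC-02": {"REQ-1", "REQ-3"}}
gaps = uncovered_requirements(reqs, matrix)  # REQ-2 has no test case yet
```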

4.3. Test Execution Process

Execute each test step in the test case → Mark status as Pass/Fail in HP ALM → Raise defects for the failed test cases in HP ALM → Send the daily status report to the Test Lead → Complete the execution of all the test cases → Participate in the defect triage cycle and explain the defects
 Once all test cases are approved and the test environment is ready for testing, the tester will start an exploratory test of the application to ensure the application is stable for testing.

 Each tester is assigned test cases directly in HP ALM.

 Testers must ensure they have the necessary access to the testing environment and to HP ALM for updating test status and raising defects. Any issues will be escalated to the Test Lead and, in turn, to the Project Manager.

 Any showstopper found during exploratory testing will be escalated to the respective development SPOCs for fixes.

 Each tester performs step-by-step execution and updates the execution status, entering a Pass or Fail status for each step directly in HP ALM.

 The tester will prepare a run chart with day-wise execution details.

 For any failures, defects will be raised as per the severity guidelines in the HP ALM tool, detailing the steps to simulate the issue, along with screenshots if appropriate.

 Daily test execution status as well as defect status will be reported to all stakeholders.

 The testing team will participate in defect triage meetings to ensure all test cases are executed with either a pass or fail status.

 If any defects are found outside the documented test steps, such defects need to be captured in HP ALM and mapped at the test case level, or at the specific step where the issue was encountered, after confirming with the Test Lead.

 This process is repeated until all test cases are executed fully with a Pass/Fail status.

 During the subsequent cycle, any defect fixes applied will be tested and the results will be updated in HP ALM during that cycle.

As per process, the final sign-off and project completion process will be followed.
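The day-wise run chart mentioned above can be sketched as a simple tally of executions per day with a cumulative total; the data shapes are assumptions:

```python
# Sketch of the tester's day-wise run chart: count of test case
# executions per day plus a running cumulative total. Input shape
# (date string, status) is an assumption for illustration.

from collections import Counter

def run_chart(executions):
    """executions: list of (date_str, status) tuples, one per run."""
    per_day = Counter(day for day, _ in executions)
    cumulative, chart = 0, []
    for day in sorted(per_day):
        cumulative += per_day[day]
        chart.append((day, per_day[day], cumulative))
    return chart

chart = run_chart([("2024-01-01", "Pass"), ("2024-01-01", "Fail"),
                   ("2024-01-02", "Pass")])
```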


4.4. Test Risks and Mitigation Factors

SCHEDULE (Prob.: High, Impact: High)
The testing schedule is tight. If the start of testing is delayed due to design tasks, the test cannot be extended beyond the scheduled UAT start date.
Mitigation: The testing team can control the preparation tasks (in advance) and communicate early with the involved parties. Some buffer has been added to the schedule for contingencies, although not as much as best practices advise.

RESOURCES (Prob.: Medium, Impact: High)
Not enough resources; resources on-boarding too late (the process takes around 15 days).
Mitigation: Holidays and vacation have been estimated and built into the schedule; deviations from the estimate could result in delays in the testing.

DEFECTS (Prob.: Medium, Impact: High)
Defects are found at a late stage of the cycle, or in a late cycle; defects discovered late are most likely due to unclear specifications and are time-consuming to resolve.
Mitigation: A defect management plan is in place to ensure prompt communication and fixing of issues.

SCOPE (Prob.: Medium, Impact: Medium)
Scope completely defined.
Mitigation: Scope is well defined, but the changes in the functionality are not yet finalized or keep on changing.

Natural disasters (Prob.: Low, Impact: Medium)
Mitigation: Teams and responsibilities have been spread across two different geographic areas. In a catastrophic event in one of the areas, there will be resources in the other area to continue the testing activities (although at a slower pace).

Non-availability of independent test environment and accessibility (Prob.: Medium, Impact: High)
Due to non-availability of the environment, the schedule gets impacted, leading to a delayed start of test execution.

Delayed testing due to new issues (Prob.: Medium, Impact: High)
During testing, there is a good chance that some "new" defects may be identified and may become issues that take time to resolve. There are also defects that can be raised during testing because of unclear document specifications; these can become issues that need time to be resolved. If these issues become showstoppers, they will greatly impact the schedule.

4.5. Communications Plan and Team Roster

4.6. Role Expectations

The following list defines in general terms the expectations related to the roles
directly involved in the management, planning or execution of the test for the
project.

Roles Name Contact Info

1. Project Manager
2. Test Lead

3. Business Analyst

4. Development Lead

5. Testing Team

6. Development Team

7. Technical Lead

4.6.1. Project Management

 Project Manager: reviews the content of the Test Plan, Test Strategy and Test Estimates, and signs off on them.

4.6.2. Test Planning (Test Lead)

 Ensure entrance criteria are used as input before starting the execution.

 Develop test plan and the guidelines to create test conditions, test
cases, expected results and execution scripts.

 Provide guidelines on how to manage defects.

 Attend status meetings in person or via the conference call line.

 Communicate to the test team any changes that need to be made to the
test deliverables or application and when they will be completed.

 Provide on premise or telecommute support.

 Provide functional (Business Analysts) and technical team support to test team personnel (if needed).

4.6.3. Test Team

 Develop test conditions, test cases, expected results, and execution scripts.

 Perform execution and validation.

 Identify, document and prioritize defects according to the guidance
provided by the Test lead.

 Re-test after software modifications have been made according to
the schedule.

 Prepare testing metrics and provide regular status.

4.6.4. Test Lead

 Acknowledge the completion of a section within a cycle.

 Give the OK to start the next level of testing.

 Facilitate defect communications between testing team and
technical / development team.

4.6.5. Development Team

 Review testing deliverables (test plan, cases, scripts, expected results, etc.) and provide timely feedback.

 Assist in the validation of results (if requested).

 Support the development and testing processes being used to
support the project.

 Certify correct components have been delivered to the test
environment at the points specified in the testing schedule.

 Keep project team and leadership informed of potential software
delivery date slips based on the current schedule.

 Define processes/tools to facilitate the initial and ongoing migration
of components.

 Conduct first line investigation into execution discrepancies and
assist test executors in creation of accurate defects.

 Implement fixes to defects according to schedule.

5. TEST ENVIRONMENT

UNITECHCONSULTANCY MODULE's servers will be hosted at X company's site.

UNITECHCONSULTANCY MODULE will be hosted on two servers: one to host the actual website and (language) code, and the other to host the (database name) database.

A Windows environment with Internet Explorer 8, 9 and 10, with Firefox 27.0, and with Google Chrome 32.0 and later should be available to each tester.
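For scripted environment checks, the browser matrix above could be captured as data, for example (the server descriptions mirror the plan's placeholders; the structure itself is an assumption):

```python
# Sketch of the test environment as data, so environment readiness
# checks can be scripted. Browser versions come from the plan; server
# names and the dict layout are illustrative assumptions.

ENVIRONMENT = {
    "servers": {
        "web": "hosts the website and (language) code",
        "db": "hosts the (database name) database",
    },
    "browsers": [
        ("Internet Explorer", "8"),
        ("Internet Explorer", "9"),
        ("Internet Explorer", "10"),
        ("Firefox", "27.0"),
        ("Chrome", "32.0+"),
    ],
}

def browsers_for(vendor):
    """Versions of a given browser each tester's machine should have."""
    return [v for name, v in ENVIRONMENT["browsers"] if name == vendor]
```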
