Software Testing Manual
PRACTICAL RECORD
BONAFIDE CERTIFICATE
REGISTER NUMBER
Certified that this is the bonafide record of work done by Mr./Miss.
Mission
1. To empower students by imparting a comprehensive education in the field of Artificial Intelligence
and Data Sciences, cultivating a profound grasp of fundamental principles and methodologies in
cutting-edge technologies.
2. To cultivate a strong connection with industries to encourage partnerships in technology training,
facilitate internships, and prepare students to become industry-ready professionals.
3. To encourage and endorse students' involvement in research and entrepreneurship with an
innovative approach that effectively tackles real-world challenges.
PEO 1 Graduates will utilize their foundational knowledge in basic sciences, mathematics,
Artificial Intelligence, data science, and statistics to design and build systems for managing
and analyzing large volumes of data.
PEO 2 Advance their technical skills to conduct pioneering research in AI and Data Science,
creating disruptive and sustainable solutions to address critical challenges in ecosystems and
various domains.
PEO 3 Think logically, engage in lifelong learning, collaborate ethically within multidisciplinary
teams, and contribute innovative ideas to drive economic growth.
PSO 1 Utilize foundational knowledge in AI, data science, and statistics to design efficient,
domain-specific processes for effective decision-making in areas such as business,
governance, and various industries.
PSO 2 Apply AI and data analytics to extract insights from data, create disruptive solutions, and
address complex societal, environmental, and engineering challenges using cutting-edge
technologies.
PSO 3 Develop skills in data analytics, knowledge engineering, and visualization to manage
complex projects, collaborate in multidisciplinary teams, and continuously contribute to
economic and societal progress through innovative solutions.
PO Program Outcome(s)
PO 01: Engineering knowledge
PO 02: Problem analysis
PO 03: Design/development of solutions
PO 08: Ethics
PO 10: Communication
PO 12: Life-long learning
Dos and Don’ts
Laboratory Rules & Regulations:
Students have to sign the log-book while entering and leaving the Lab, and must record their time in and time out.
Students have to enter and leave the Lab at their scheduled time; otherwise they will be marked absent.
Students should come in proper Lab uniform and wear shoes.
Students should properly shut down the computer systems before they leave the Lab.
Students are not allowed to use CDs, DVDs, USB drives, etc. If required, prior permission of the Laboratory in-charge is needed.
All students will be responsible for keeping the Lab clean.
Students should refrain from dislocating, shifting, or damaging any parts of the computer or any other device in the Lab.
The students should not load or delete any software from the computer.
The students should not use computers in the Lab for any personal work.
Internet browsing will not be allowed in the Lab beyond the stipulated hours as per the timetable.
The Instructor/Lecturer will be the sole authority to judge the disciplinary behavior inside the
laboratory. For violation of any of the above rules, the department reserves the right to take appropriate
disciplinary action.
Browsing of non-academic Internet sites will not be allowed in the Lab.
Before downloading any materials, consult the instructor and handle the downloaded files as per the instructions given by the Laboratory in-charge.
Because of security concerns, downloading software, music, etc. from the Internet is strictly prohibited. Any such file found on the hard disk will be deleted without warning.
Students should arrange the chairs properly before leaving the Lab. If required, prior permission of the Laboratory in-charge and the Department in-charge is needed.
SYLLABUS
COURSE OBJECTIVES:
To understand the basics of software testing
To learn how to do the testing and planning effectively
To build test cases and execute them
To focus on wide aspects of testing and understanding multiple facets of testing
To get an insight into test automation and the tools used for test automation
1. Develop the test plan for testing an e-commerce web/mobile application (www.amazon.in).
2. Design the test cases for testing the e-commerce application
3. Test the e-commerce application and report the defects in it.
4. Develop the test plan and design the test cases for an inventory control system.
5. Execute the test cases against a client server or desktop application and identify the defects.
6. Test the performance of the e-commerce application.
7. Automate the testing of e-commerce applications using Selenium.
8. Integrate TestNG with the above test automation.
9. Mini Project:
a) Build a data-driven framework using Selenium and TestNG
b) Build Page Object Model using Selenium and TestNG
c) Build BDD framework with Selenium, TestNG and Cucumber
COURSE OUTCOMES:
CO1: Understand the basic concepts of software testing and the need for software testing
CO2: Design Test planning and different activities involved in test planning
CO3: Design effective test cases that can uncover critical defects in the application
CO4: Carry out advanced types of testing
CO5: Automate the software testing using Selenium and TestNG
TOTAL : 30 PERIODS
TEXTBOOKS
List of Experiments
PREFACE of JAVA
1 Develop the test plan for testing an e-commerce web/mobile application (www.amazon.in).
2 Design the test cases for testing the e-commerce application.
3 Test the e-commerce application and report the defects in it.
4 Develop the test plan and design the test cases for an inventory control system.
5 Execute the test cases against a client server or desktop application and identify the defects.
6 Test the performance of the e-commerce application.
7 Automate the testing of e-commerce applications using Selenium.
8 Integrate TestNG with the above test automation.
Mini Project
a) Build a data-driven framework using Selenium and TestNG
b) Build Page Object Model using Selenium and TestNG
c) Build BDD framework with Selenium, TestNG and Cucumber
PREFACE of JAVA
Java is a high-level programming language originally developed by Sun Microsystems. Java runs on a variety of platforms, such as Windows, Mac OS, and the various versions of UNIX. Java was designed to be "Simple, Robust, Portable, Platform-independent, Secured, High Performance, Multithreaded, Architecture Neutral, Object-Oriented, Interpreted, and Dynamic".
JAVA PLATFORMS
1) Java SE (Java Standard Edition)
It is a Java programming platform. It includes Java programming APIs such as java.lang, java.io, java.net, java.util, java.sql, java.math etc. It covers core topics like OOPs, String, Regex, Exception, Inner classes, Multithreading, I/O Stream, Networking, AWT, Swing, Reflection, Collection, etc.
2) Java EE (Java Enterprise Edition)
It is an enterprise platform that is mainly used to develop web and enterprise applications. It is built on top of the Java SE platform. It includes topics like Servlet, JSP, Web Services, EJB, JPA, etc.
3) Java ME (Java Micro Edition)
It is a micro platform that is mainly used to develop applications for mobile and embedded devices.
4) JavaFX
It is used to develop rich internet applications. It uses a lightweight user interface API.
To set the temporary path of the JDK, follow these steps:
1. Open the command prompt.
2. Copy the path of the JDK's bin directory.
3. In the command prompt, write: set path=copied_path
For Example:
set path=C:\Program Files\Java\jdk1.6.0_23\bin
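Once the path is set, the installation can be checked from the same command prompt by compiling and running a small program; the class below is only an illustrative example, not part of the prescribed exercises.

class HelloTest {
    public static void main(String[] args) {
        // Prints a message to confirm that javac and java are reachable through the path.
        System.out.println("JDK path is configured");
    }
}

Save it as HelloTest.java, compile it with javac HelloTest.java, and run it with java HelloTest; the command java -version also shows which JDK the path points to.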
INDEX PAGE
CCS366 SOFTWARE TESTING AND AUTOMATION LABORATORY
Experiment 1: Develop the Test Plan for Testing an E-commerce Web/Mobile Application (www.amazon.in)
Aim:
The aim of this experiment is to develop a comprehensive test plan for testing the functionality and
usability of the e-commerce web/mobile application www.amazon.in.
Algorithm:
1. Identify the Scope: Determine the scope of testing, including the features and functionalities that
need to be tested.
2. Define Test Objectives: Specify the primary objectives of testing, such as functional testing, usability testing, performance testing, security testing, etc.
3. Identify Test Environment: Define the platforms, browsers, devices, and operating systems on
which the application will be tested.
4. Determine Test Deliverables: Decide on the documents and artifacts that will be generated during the testing process, such as test cases, test reports, and defect logs.
5. Create Test Strategy: Develop an overall approach for testing, including the testing techniques, entry and exit criteria, and the roles and responsibilities of the testing team.
6. Define Test Scope and Schedule: Specify the timeline for each testing phase and the scope of
testing for each phase.
7. Risk Analysis: Identify potential risks and their impact on the testing process, and devise risk
mitigation strategies.
8. Resource Planning: Allocate the necessary resources, including the testing team, hardware, and
software required for testing.
9. Test Case Design: Prepare detailed test cases based on the requirements and functionalities of the
e-commerce application.
10. Test Data Setup: Arrange test data required for executing the test cases effectively.
11. Test Execution: Execute the test cases and record the test results.
12. Defect Reporting: Document any defects encountered during testing and track their resolution.
Test Plan:
The test plan should cover the following sections:
1. Introduction: Briefly describe the purpose of the test plan and provide an overview of the e-commerce application to be tested.
3. Test Scope: Specify the features and functionalities to be tested and any limitations on testing.
4. Test Environment: Describe the hardware, software, browsers, and devices to be used for testing.
7. Risk Analysis: Identify potential risks and the strategies to mitigate them.
9. Test Case Design: Include a summary of the test cases developed for the application.
10. Test Data Setup: Describe the process of arranging test data for testing.
11. Defect Reporting: Explain the procedure for reporting and tracking defects.
Test Case Table:
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC002 | Test Objectives | 1. Review the test plan document. | Verify the test objectives. | Done | The test objectives are well-defined. | |
TC003 | Test Environment | 1. Review the test plan document. | Check the specified environments. | Done | Test environments are mentioned. | |
TC007 | Risk Analysis | 1. Review the test plan document. | Ensure potential risks are identified. | Done | Risks and mitigation strategies are mentioned. | |
TC008 | Resource Planning | 1. Review the test plan document. | Check the required resources. | Done | Resources needed for testing are listed. | |
TC009 | Test Case Design | 1. Review and execute the test cases. | Validate the prepared test cases. | Done | Test cases are accurate and functional. | |
TC010 | Test Data Setup | 1. Review the test data setup process. | Verify the availability of test data. | Done | Test data is available for testing. | |
TC011 | Test Execution | 1. Run the test cases and document the outcomes. | Execute the test cases. | In Progress | Test results are recorded and documented. | |
TC013 | Defect Tracking | 1. Monitor defect status and updates. | Verify the tracking of defects. | Not Started | Defects are tracked until resolution. | |
Explanation:
The test plan is a crucial document that outlines the entire testing process. It ensures that all aspects
of the e-commerce application are thoroughly tested, and the results are systematically documented.
Result:
Upon completion of the experiment, you will have a well-structured test plan that provides a clear road
map for testing the e-commerce web/mobile application www.amazon.in.
Experiment 2: Design the Test Cases for Testing the E-commerce Application
Aim:
The aim of this experiment is to design a set of comprehensive and effective test cases for testing the
e-commerce application www.amazon.in.
Algorithm:
1. Understand Requirements: Familiarize yourself with the functional and non-functional requirements of the e-commerce application.
2. Identify Test Scenarios: Based on the requirements, identify different test scenarios that cover all aspects of the application.
3. Write Test Cases: Develop test cases for each identified scenario, including preconditions, steps to
be executed, and expected outcomes.
4. Cover Edge Cases: Ensure that the test cases cover edge cases and boundary conditions to verify the robustness of the application (a small automated sketch of such a check is given below).
5. Prioritize Test Cases: Prioritize the test cases based on their criticality and relevance to the
application.
6. Review Test Cases: Conduct a peer review of the test cases to ensure their accuracy and
completeness.
7. Optimize Test Cases: Optimize the test cases for reusability and maintainability.
3. Test Case Description: Detailed steps to execute the test.
4. Precondition: The necessary conditions that must be satisfied before executing the test case.
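As a small illustration of the "Cover Edge Cases" step above, a boundary condition such as an allowed order quantity can be written directly as an automated check. The sketch below uses TestNG, which is introduced formally in Experiment 8; the quantity limits (1 to 10) and the validateQuantity helper are assumptions made only for this example and are not taken from the Amazon application.

package examples;

import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class QuantityBoundaryTest {

    // Hypothetical validation rule: order quantities from 1 to 10 are accepted.
    private boolean validateQuantity(int quantity) {
        return quantity >= 1 && quantity <= 10;
    }

    // Boundary values on both sides of the assumed limits.
    @DataProvider(name = "quantities")
    public Object[][] quantities() {
        return new Object[][] {
                {0, false}, {1, true}, {10, true}, {11, false}
        };
    }

    @Test(dataProvider = "quantities")
    public void quantityBoundaries(int quantity, boolean expected) {
        Assert.assertEquals(validateQuantity(quantity), expected);
    }
}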
Test Case Table:
Process: Test Case Design
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | User Registration | 1. Navigate to the registration page. | Verify user registration process. | Done | User can successfully register. | |
TC004 | Add to Cart | 1. Browse the product catalog. | Verify adding products to the cart. | Done | Product is added to the shopping cart. | |
TC005 | Shopping Cart Validation | 1. Click on the shopping cart icon. | Verify the shopping cart contents. | Done | Items in the shopping cart are displayed. | |
TC006 | Checkout Process | 1. Click on the "Checkout" button. | Verify the checkout process. | Not Started | Checkout process proceeds as expected. | |
Explanation:
Test cases are designed to validate the functionality and behaviour of the e-commerce application.
They ensure that the application performs as intended and meets the specified requirements.
Result:
Upon completion of the experiment, you will have a set of well-defined test cases ready for testing
the e-commerce application www.amazon.in.
Experiment 3: Test the E-commerce Application and Report the Defects in It
Aim:
The aim of this experiment is to execute the designed test cases and identify defects or issues in the
e-commerce application www.amazon.in.
Algorithm:
1. Test Environment Setup: Set up the testing environment with the required hardware, software, and test data.
2. Test Case Execution: Execute the test cases designed in Experiment 2, following the specified
steps.
3. Defect Identification: During test execution, record any discrepancies or issues encountered.
4. Defect Reporting: Log the identified defects with detailed information, including steps to
reproduce, severity, and priority.
5. Defect Tracking: Track the progress of defect resolution and verify fixes as they are implemented.
6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.
7. Regression Testing: Conduct regression testing to ensure new changes do not introduce new defects.
Test Case Table:
Process: Test Case Design
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | User Registration | 1. Navigate to the registration page. | Verify user registration process. | Done | User can successfully register. | |
TC002 | User Login | 1. Navigate to the login page. | Verify user login process. | Done | User can successfully log in. | |
TC003 | Search Functionality | 1. Enter a keyword in the search bar. | Verify search functionality. | Done | Search results relevant to the keyword. | |
TC004 | Add to Cart | 1. Browse the product catalog. | Verify adding products to the cart. | Done | Product is added to the shopping cart. | |
TC005 | Shopping Cart Validation | 1. Click on the shopping cart icon. | Verify the shopping cart contents. | Done | Items in the shopping cart are displayed. | |
TC006 | Checkout Process | 1. Click on the "Checkout" button. | Verify the checkout process. | Not Started | Checkout process proceeds as expected. | |
Explanation:
Testing the e-commerce application aims to validate its functionality and usability. By identifying
and reporting defects, you ensure the application's quality and reliability.
Result:
Upon completion of the experiment, you will have a list of identified defects and their status after
resolution.
Experiment 4: Develop the Test Plan and Design the Test Cases for an Inventory Control
System
Aim:
The aim of this experiment is to create a comprehensive test plan and design test cases for an
Inventory Control System.
Algorithm:
Follow the same algorithm as described in Experiment 1 for developing the test plan for an inventory
control system.
Follow the same algorithm as described in Experiment 2 for designing test cases for an inventory
control system.
Test Plan:
Process: Test Plan
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | Scope of Testing | 1. Review the requirements and project documentation. 2. Identify the modules to be tested. 3. Determine the out-of-scope items. | Verify the scope of testing. | Done | The test plan includes all essential features. | |
TC002 | Test Objectives | 1. Review the requirements and project documentation. 2. Discuss with stakeholders to understand expectations. | Verify the test objectives. | Done | The test objectives are clearly defined. | |
TC003 | Test Environment | 1. Identify the hardware and software requirements. 2. Set up the required hardware and software. | Verify the required environments. | Not Started | The test environment is defined. | |
TC004 | Test Deliverables | 1. Determine the documents and artifacts to be produced. 2. Create templates for test reports, defect logs, etc. | Verify the required deliverables. | Not Started | All necessary documents are listed. | |
TC005 | Test Strategy | 2. Determine the entry and exit criteria. | | | | |
TC006 | Test Scope and Schedule | 1. Define the timeline for each testing phase. 2. Determine the scope of testing for each phase. | Verify the schedule for testing. | Not Started | The schedule is established. | |
TC007 | Risk Analysis | 1. Identify potential risks in the testing process. 2. Discuss risk mitigation strategies with the team. | Verify risk analysis and mitigation strategies. | Not Started | Potential risks are identified with mitigation plans. | |
TC008 | Resource Planning | 1. Allocate the required resources for testing. 2. Determine the roles and responsibilities of the team. | Verify the availability of resources. | Not Started | Resources needed for testing are allocated. | |
Test Case Design:
Process: Test Case Design
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | Module A - Functionality Test | 1. Review the requirements related to Module A. 2. Identify test scenarios for Module A. 3. Develop detailed test cases for Module A. | Verify the functionality of Module A. | Not Started | All functionalities of Module A are tested. | |
TC002 | Module B - Integration Test | 2. Identify integration points with other modules. 3. Design test cases for testing integration scenarios. | | | | |
TC003 | Module C - Performance Test | 3. Develop performance test cases for Module C. | | | | |
TC004 | Module D - Usability Test | 1. Review the usability requirements for Module D. 2. Identify usability aspects to be tested. 3. Create test cases for evaluating Module D's usability. | Verify the usability of Module D. | Not Started | Module D is user-friendly and intuitive. | |
TC005 | Module E - Security Test | 1. Review the security requirements for Module E. 2. Identify potential security vulnerabilities. 3. Design security test cases to assess Module E. | Verify the security of Module E. | Not Started | Module E is protected against security threats. | |
Explanation:
An inventory control system is critical for managing stock and supplies. Proper testing ensures the
system functions accurately and efficiently.
Result:
Upon completion of the experiment, you will have a well-structured test plan and a set of test cases
ready for testing the Inventory Control System.
Experiment 5: Execute the Test Cases against a Client-Server or Desktop Application and
Identify the Defects
Aim:
The aim of this experiment is to execute the test cases against a client-server or desktop application
and identify defects.
Algorithm:
1. Test Environment Setup: Set up the testing environment, including the client-server or desktop
application, required hardware, and test data.
2. Test Case Execution: Execute the test cases designed in Experiment 2 against the application.
3. Defect Identification: During test execution, record any discrepancies or issues encountered.
4. Defect Reporting: Log the identified defects with detailed information, including steps to
reproduce, severity, and priority.
5. Defect Tracking: Track the progress of defect resolution and verify fixes as they are implemented.
6. Retesting: After defect fixes, retest the affected areas to ensure the issues are resolved.
7. Regression Testing: Conduct regression testing to ensure new changes do not introduce new
defects.
Test Case Table:
Process: Test Case Execution
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | User Login | 1. Launch the application. 2. Enter valid login credentials. 3. Click on the "Login" button. | Verify user login process. | Not Started | User can successfully log in. | |
TC002 | Data Validation | 1. Access a data input form. 2. Enter invalid data in the form fields. 3. Submit the form. | Verify data validation on the form. | Not Started | Invalid data shows appropriate error messages. | |
TC003 | | 2. Select a file from the system. 3. Click on the "Upload" button. | | | | |
TC004 | Network Connectivity | 1. Disconnect the network. 2. Attempt to perform an action requiring network access. | Verify the application's response. | Not Started | Application gracefully handles disconnection. | |
TC005 | | 2. Perform actions simultaneously. | | | | |
TC006 | | 2. Execute tests on various browsers. | | | | |
TC007 | Client-Server Communication | 1. Monitor network traffic between client and server. | Verify communication integrity. | Not Started | Data is correctly transmitted and received. | |
Explanation:
Testing a client-server or desktop application ensures its functionality across different platforms and
environments.
Result:
Upon completion of the experiment, you will have a list of identified defects and their status after
resolution for the client-server or desktop application.
Experiment 6: Test the Performance of the E-commerce Application
Aim:
The aim of this experiment is to test the performance of the e-commerce application www.amazon.in.
Algorithm:
1. Identify Performance Metrics: Determine the performance metrics to be measured, such as
response time, throughput, and resource utilization.
2. Define Test Scenarios: Create test scenarios that simulate various user interactions and load on the application.
3. Performance Test Setup: Set up the performance testing environment with appropriate hardware
and software.
4. Execute Performance Tests: Run the performance tests using the defined scenarios and collect
performance data.
5. Analyze Performance Data: Analyze the collected data to identify any performance bottlenecks or issues.
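The response-time measurement described above (and in TC001 of the table below) can be scripted instead of timed by hand. The following sketch is only a minimal illustration using Java's built-in HttpClient; the sample count and the 3000 ms threshold are assumptions for the example, not values prescribed by the experiment.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ResponseTimeCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://www.amazon.in"))   // page under test
                .GET()
                .build();

        int samples = 5;          // assumed number of measurements
        long thresholdMs = 3000;  // assumed acceptable response time in milliseconds
        long totalMs = 0;

        for (int i = 0; i < samples; i++) {
            long start = System.currentTimeMillis();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            long elapsed = System.currentTimeMillis() - start;
            totalMs += elapsed;
            System.out.println("Sample " + (i + 1) + ": HTTP " + response.statusCode()
                    + " in " + elapsed + " ms");
        }

        long averageMs = totalMs / samples;
        System.out.println("Average response time: " + averageMs + " ms");
        System.out.println(averageMs <= thresholdMs ? "Within threshold" : "Exceeds threshold");
    }
}

Dedicated load-testing tools also report throughput and resource utilization, but even a small script like this gives a repeatable number to compare against the threshold recorded in the table below.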
Test Case Table:
Process: Performance Testing
No. | Test Case | Steps | Description | Status | Expected Result | Actual Result | Comments
TC001 | Response Time for Home Page | 1. Access the home page of the e-commerce application. 2. Use a performance testing tool to record the time. 3. Analyze the recorded data to determine response time. | Measure the response time. | Not Started | The home page loads within the specified response time threshold. | |
TC002 | Throughput | 2. Execute performance tests during peak hours. 3. Analyze the data to determine the throughput. | | | | |
TC003 | Resource Utilization | 1. Monitor CPU, memory, and network usage during testing. 2. Execute performance tests while monitoring resources. 3. Analyze the data to assess resource utilization. | Measure resource utilization. | Not Started | Resource utilization remains within acceptable limits. | |
TC004 | Concurrent Users | 1. Simulate multiple concurrent users accessing the app. 2. Increase the number of concurrent users gradually. 3. Record the application's behavior with increased load. | Measure app performance under load. | Not Started | The application remains stable and responsive under load. | |
TC005 | Stress Testing | 1. Apply maximum load to test the system's breaking point. 2. Apply the maximum user load the application can handle. 3. Observe the application's response under stress. | Measure system behavior under extreme load. | Not Started | The system recovers gracefully after stress is removed. | |
TC006 | Performance Tuning | 1. Identify performance bottlenecks and areas of improvement. 2. Analyze the performance test results. 3. Implement necessary optimizations. | Improve application performance. | Not Started | Performance bottlenecks are addressed and the application performs better. | |
Explanation:
Performance testing helps to identify bottlenecks in the e-commerce application, ensuring it can
handle real-world user loads effectively.
Result:
Upon completion of the experiment, you will have performance test results and any optimizations
made to improve the application's performance.
Experiment 7: Automate the testing of e-commerce applications using Selenium.
Aim:
The aim of this task is to automate the testing of an e-commerce web application (www.amazon.in) using Selenium WebDriver, which will help improve testing efficiency and reliability.
Algorithm:
1. Set up the environment:
- Install Java Development Kit (JDK) and configure the Java environment variables.
- Install an Integrated Development Environment (IDE) like Eclipse or IntelliJ.
- Download Selenium WebDriver and the required browser drivers for the browsers you intend to test (e.g., ChromeDriver for Chrome, GeckoDriver for Firefox).
6. Analyze the test results:
- Review the test execution results to identify any failed test cases.
- Debug and fix any issues with the automation scripts if necessary.
7. Report defects:
Program:
package program;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class selenium {
    public static void main(String[] args) {
        // Point Selenium at the local ChromeDriver executable.
        System.setProperty("webdriver.chrome.driver",
                "C:\\Users\\Admin\\Downloads\\chromedriver-win64\\chromedriver-win64\\chromedriver.exe");
        WebDriver d = new ChromeDriver();

        // Open the application and navigate to the sign-in page.
        d.get("https://www.amazon.in");
        d.findElement(By.xpath("//*[@id=\"nav-link-accountList\"]/span/span")).click();

        // Enter the login credentials (replace with valid test credentials).
        d.findElement(By.id("ap_email")).sendKeys("[email protected]");
        d.findElement(By.xpath("//*[@id=\"continue\"]")).click();
        d.findElement(By.id("ap_password")).sendKeys("your password");
        d.findElement(By.xpath("//*[@id=\"signInSubmit\"]")).click();

        // Verify the landing URL after sign-in to decide pass/fail.
        String u = d.getCurrentUrl();
        if (u.equals("https://www.amazon.in/?ref_=nav_ya_signin")) {
            System.out.println("Test Case Passed");
        } else {
            System.out.println("Test Case Failed");
        }
        d.close();
    }
}
Automation Process:
Console output:
Result:
The successful completion of this task will yield:
- Automated test scripts for the e-commerce application using Selenium WebDriver.
- Identification of defects, if any, in the application.
Experiment 8: Integrate TestNG with the above test automation.
Aim:
The aim of this task is to integrate TestNG with the existing Selenium automation scripts for the
e-commerce application, enhancing test management, parallel execution, and reporting capabilities.
Algorithm:
1. Set up TestNG in the project:
- Create an XML configuration file for TestNG to define test suites, test groups, and other configurations.
2. Analyze the test results:
- Review the TestNG-generated test reports to identify any failed test cases.
- Utilize TestNG's reporting capabilities to understand the test execution status.
3. Report defects:
- Provide detailed information about each defect, including steps to reproduce and expected results.
Program Code (Program1.java):
package mytest;

import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class Program1 {

    WebDriver driver;

    // Runs before every test: start the browser and open the application.
    @BeforeMethod
    public void setUp() {
        System.setProperty("webdriver.chrome.driver",
                "C:\\selenium\\chromedriver_win32\\chromedriver.exe");
        driver = new ChromeDriver();
        driver.get("https://amazon.in");
        driver.manage().window().maximize();
        driver.manage().timeouts().implicitlyWait(Duration.ofSeconds(5));
    }

    // Test 1: the page title should match the expected Amazon title.
    @Test
    public void verifyTitle() {
        String actualTitle = driver.getTitle();
        String expectedTitle = "Online Shopping site in India: Shop Online for Mobiles, Books, Watches, Shoes and More - Amazon.in";
        Assert.assertEquals(actualTitle, expectedTitle);
    }

    // Test 2: the Amazon logo should be displayed on the home page.
    @Test
    public void verifyLogo() {
        boolean flag = driver.findElement(By.xpath("//a[@id='nav-logo-sprites']")).isDisplayed();
        Assert.assertTrue(flag);
    }

    // Runs after every test: close the browser session.
    @AfterMethod
    public void tearDown() {
        driver.quit();
    }
}
Program Code (pom.xml) :
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-
4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>MiniProject2</groupId>
<artifactId>MiniProject2</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>4.3.0</version>
</dependency>
<!-- TestNG is required for the annotations and assertions used in Program1.java -->
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>7.4.0</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<release>16</release>
</configuration>
</plugin>
</plugins>
</build>
</project>
Program Code (testng.xml) :
<suite name="Suite">
<test name="Test">
<classes>
<class name="mytest.Program1"></class>
</classes>
</test>
</suite>
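The suite can be run from the IDE (for example, by right-clicking testng.xml and choosing Run As > TestNG Suite in Eclipse with the TestNG plugin installed) or wired into the Maven build; in either case TestNG generates an HTML report summarizing the passed and failed tests.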
Output:
Result:
The successful completion of this task will yield:
- Integration of TestNG with the existing Selenium automation scripts.
- Enhanced test management and reporting capabilities.
- Identification of defects, if any, in the application and improved efficiency in handling test
scenarios.