Unit-I Software Testing Background-1
• The software worked on a few systems, likely the ones that the Disney
programmers used to create the game, but not on the most common
systems that the general public had.
2. Intel Pentium Floating Point Division Bug
• Enter the following equation into your PC's calculator: (4195835 / 3145727) x 3145727 - 4195835. If the answer is zero, your computer is fine.
• If you get anything else, you have an old Intel Pentium CPU with a floating-point division bug: a software bug burned into a computer chip and reproduced over and over in the manufacturing process. (A small C sketch of this check appears after this list.)
• On October 30, 1994, Dr. Thomas R. Nicely of Lynchburg (Virginia) College traced an
unexpected result from one of his experiments to an incorrect answer to a division problem
solved on his Pentium PC.
• He posted his find on the Internet and soon afterward a firestorm erupted as numerous
other people duplicated his problem and found additional situations that resulted in
wrong answers.
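A minimal C sketch of the same check (the operand values are the ones commonly quoted for this bug; on a correct FPU the printed difference is essentially zero, while an affected Pentium was reported to produce an error of about 256):

#include <stdio.h>

int main(void)
{
    /* Divide, multiply back, and subtract: with correct floating-point
       division the difference is (essentially) zero; the flawed Pentium
       FDIV instruction returned a slightly wrong quotient, so the
       difference became clearly non-zero. */
    double result = (4195835.0 / 3145727.0) * 3145727.0 - 4195835.0;
    printf("FDIV check result: %g\n", result);
    return 0;
}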
Intel Pentium Floating Point Division Bug….
• Fortunately, these cases were rare and resulted in wrong answers only for extremely
math-intensive, scientific, and engineering calculations. Most people would never
encounter them doing their taxes or running their businesses.
• What makes this story notable isn't the bug, but the way Intel handled the situation:
• Their software test engineers had found the problem while performing their own
tests before the chip was released. Intel's management decided that the problem
wasn't severe enough or likely enough to warrant fixing it or even publicizing it.
• Once the bug was found, Intel attempted to diminish its perceived severity through
press releases and public statements.
• When pressured, Intel offered to replace the faulty chips, but only if a user could
prove that he was affected by the bug.
3. NASA Mars Polar Lander
• On December 3, 1999, NASA's Mars Polar Lander disappeared during its
landing attempt on the Martian surface.
• The Mars Polar Lander failed even though each module designed for the mission had been tested separately and functioned perfectly.
• The various modules, however, were never integrated and tested together.
• The touchdown switch of the lander had been tested only in isolation; once the modules were integrated, a slight jerk during the landing flipped the switch's bit to 1, causing the entire lander to fail.
4. Y2K Bug
• Sometime in the early 1970s, a computer programmer (let's suppose his name was Dave) was working on a payroll system for his company.
• The computer he was using had very little memory for storage, forcing him to
conserve every last byte he could.
• Dave was proud that he could pack his programs more tightly than any of his peers.
• One method he used was to shorten dates from their 4-digit format, such as 1973, to a 2-digit format, such as 73.
• Because his payroll program relied heavily on date processing, Dave could save lots
of expensive memory space.
• He briefly considered the problems that might occur when the current year hit 2000
and his program began doing computations on years such as 00 and 01.
Y2K Bug…..
• He knew there would be problems but decided that his program would surely
be replaced or updated in 25 years and his immediate tasks were more
important than planning for something that far out in time.
• After all, he had a deadline to meet. In 1995, Dave's program was still being
used, Dave was retired, and no one knew how to get into the program to
check if it was Y2K compliant, let alone how to fix it.
• It's estimated that several hundred billion dollars were spent, worldwide, to
replace or update computer programs such as Dave's, to fix potential Year 2000
failures.
Bug
Terms for software failure:
• A person makes an Error
• That creates a fault in software
• That can cause a failure in operation
• Error : An error is a human action that produces an incorrect result, which in turn leads to a fault.
• Bug : The presence of an error at the time of execution of the software.
• Fault : The state of the software caused by an error.
• Failure : Deviation of the software from its expected result. It is an event. (A short example of this error-fault-failure chain follows below.)
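A hypothetical C fragment illustrating this error-fault-failure chain (the boarding-age rule and the may_board() function are invented purely for illustration):

#include <stdio.h>

/* Intent  : a passenger may board if age is 18 or older.
 * Error   : the programmer types '>' instead of '>=' (a human mistake).
 * Fault   : the wrong operator now exists in the code.
 * Failure : for the boundary input age == 18 the running program
 *           produces the wrong result. */
static int may_board(int age)
{
    return age > 18;    /* faulty line; should be: age >= 18 */
}

int main(void)
{
    printf("age 18 may board? %d (expected 1)\n", may_board(18));
    return 0;
}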
Terms for software failure…..
• Defect : A defect is an error or a bug in the application that has been created.
• A programmer can make mistakes or errors while designing and building the software. These mistakes mean that there are flaws in the software; such flaws are called defects.
• Although all these terms sound similar, they are distinguished based on the severity and the area in which the software failure has occurred.
• When we run a program, the error encountered during execution is classified as a runtime error, compile-time error, computational error, or assignment error.
• An error can be removed by debugging; if it is not resolved it leads to a problem, and if the problem grows large it leads to software failure.
Software Bug
A Formal Definition: A bug can be defined, in simple terms, as any error or mistake that leads to the failure of the product or software, caused either by a specification problem or by a communication problem regarding what was developed versus what had to be developed.
For example
1. #include<stdio.h>
2. int main()
3. {
4. int i, fact, n;
5. printf("enter the number ");
6. scanf("%d", &n);
7. for(i = 1; i <= n; i++)
8. fact = fact * i;
9. printf("the factorial of the number is %d", fact);
10. }
➢ As in line number 4, fact is not initialized, so it holds a garbage value and the program gives a wrong output; this is an example of a bug.
➢ If fact were instead initialized to zero (fact = 0), the output would always be zero, because anything multiplied by zero is zero. The bug can be removed by initializing fact to 1 (fact = 1) at the point of declaration, as in the corrected sketch below.
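A corrected version of the example, assuming the fix described above (fact initialized to 1); shown here without the slide's line numbers:

#include <stdio.h>

int main(void)
{
    int i, n;
    int fact = 1;                 /* initialize to 1, not 0 or garbage */

    printf("enter the number ");
    scanf("%d", &n);
    for (i = 1; i <= n; i++)
        fact = fact * i;          /* fact now accumulates the product correctly */
    printf("the factorial of the number is %d\n", fact);
    return 0;
}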
Nature of Errors
Categories of Software Errors
• Education – The developer does not understand well enough what he or she is doing; a lack of proper education leads to errors in specification, design, coding, and testing.
• Communication – Developers do not know enough, information does not reach all stakeholders, or information is lost.
• Oversight – Omitting to do necessary things.
• Transcription – The developer knows what to do but simply makes a mistake.
• Process – The process is not applicable to the actual situation, or it places restrictions that cause errors.
Cost of Bugs
1. If the error is made and the consequent defect is detected in the requirements
phase then it is relatively cheap to fix it.
2. If an error is made and the consequent defect is found in the design phase or in
construction phase then the design can be corrected and reissued with relatively
little expense.
3. If a defect is introduced in the requirement specification and it is not detected
until acceptance testing or even once the system has been implemented then
it will be much more expensive to fix.
4. All the testing work done up to that point will need to be repeated in order to reach
the level of confidence in the software that we require.
5. It is quite often the case that defects detected at a very late stage, depending on
how serious they are, are not corrected, because the cost of doing so is too high.
Goal of Software Tester/ Testing
Who is a Software Tester?
• A software tester is the one who performs testing and finds bugs, if they exist, in the application under test.
Goal of Software Tester/ Testing
• To find defects before they cause a production system to fail.
• To find bugs, and to find them as early as possible in the software development process.
• To compile a record of software errors for use in error prevention (by corrective
and preventive actions)
Traits and Skills Required for Good Software Tester
• Communication skills – Team Player, Understand Business Needs
• Domain knowledge – Several Testing Techniques
• Desire to learn – IT is evolving every day
• Technical skills – Programming Skills
• Analytical skills – interpret, organize and prioritize activities
• Curiosity - Explorer
• Think from the user's perspective
• Be a good judge of your product
• Good at planning
SOFTWARE / SYSTEMS DEVELOPMENT LIFE CYCLE (SDLC)
• Software/Systems development life cycle (SDLC) - a structured, step-by-step approach for developing a software product or information system
• The logical steps taken to develop a software product.
• Waterfall methodology
• Rapid application development (RAD)
• Extreme programming (XP)
Waterfall Methodology
• Waterfall methodology - a sequential, activity-based process in
which each phase in the SDLC is performed sequentially from
planning through implementation
Rapid Application Development (RAD)
• Rapid application development (RAD) (also called rapid prototyping)
- emphasizes extensive user involvement in the rapid and
evolutionary construction of working prototypes of a system to
accelerate the systems development process
Extreme Programming (XP)
• Extreme programming (XP) - breaks a project into tiny phases and
developers cannot continue on to the next phase until the first phase is
complete
Testing Axioms
Testing axioms are the rules of software testing and the knowledge that helps put
some aspect of the overall process into perspective.
1. It’s Impossible to Test a Program Completely
   - The number of possible inputs and outputs is enormous (see the sketch after this list)
   - The software specification is subjective; it depends on how one interprets it
2. Software Testing Is a Risk-Based Exercise
3. Testing Can't Show That Bugs Don't Exist
4. The More Bugs You Find, the More Bugs There Are
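A small illustration of why complete testing is impossible: even a trivial function that adds two 32-bit integers has 2^32 x 2^32 (about 1.8 x 10^19) possible input pairs, far too many to try exhaustively. The sketch below is a hypothetical example (the add() function is invented for illustration):

#include <stdio.h>
#include <stdint.h>

/* A trivial function under test. */
static int32_t add(int32_t a, int32_t b)
{
    return a + b;
}

int main(void)
{
    /* Each 32-bit operand has 2^32 possible values, so exhaustively
       testing add() would require 2^32 * 2^32 = 2^64 test cases. */
    double inputs_per_operand = 4294967296.0;   /* 2^32 */
    double total_cases = inputs_per_operand * inputs_per_operand;

    printf("exhaustive test cases for add(): %.3e\n", total_cases);
    printf("sample run: add(2, 3) = %d\n", add(2, 3));
    return 0;
}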
• Test Case
A test case is a specific procedure for testing a particular requirement (a sketch of how its parts might be recorded follows the list below).
It will include:
• Identification of specific requirement to be tested
• Test case success/failure criteria
• Specific steps to execute test
• Test Data
• Test conditions
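A hedged sketch of how these parts of a test case might be recorded as a simple data structure; the TestCase struct, its field names, and the sample values are assumptions for illustration, not a standard format:

#include <stdio.h>

/* Hypothetical record mirroring the fields listed above. */
struct TestCase {
    const char *id;              /* identification of the requirement under test */
    const char *steps;           /* specific steps to execute the test */
    const char *test_data;       /* input data used by the test */
    const char *conditions;      /* test conditions that must hold */
    const char *pass_criteria;   /* success/failure criteria */
};

int main(void)
{
    struct TestCase tc = {
        .id            = "TC-LOGIN-001 (covers hypothetical requirement REQ-101)",
        .steps         = "open login page; enter credentials; press Login",
        .test_data     = "user = alice, password = valid password",
        .conditions    = "user account exists and is active",
        .pass_criteria = "dashboard page is displayed"
    };

    printf("%s\nexpected: %s\n", tc.id, tc.pass_criteria);
    return 0;
}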
Test Cases…
ENTRY CRITERIA
Entry Criteria for testing are defined as “specific conditions or on-going activities that must be present before a process can begin”. The Systems Development Life Cycle also specifies which entry criteria are required at each phase.
By referencing the Entry/Exit Criteria matrix, we gain clarity about the deliverables expected from each phase. The matrix should contain a “date required” field and should be modified to meet the specific goals and requirements of each test effort, based on its size and complexity.
Test Cases…
EXIT CRITERIA
Exit Criteria are often viewed as a single document marking the end of a life cycle phase. Exit Criteria are defined as “the specific conditions or on-going activities that must be present before a life cycle phase can be considered complete. The life cycle specifies which exit criteria are required at each phase”. This definition identifies the intermediate deliverables and allows us to track them as independent events.
By identifying the specific Exit Criteria, we are able to identify and plan how these steps and processes fit into the life cycle. All of the Exit Criteria listed above, except the Test Summary/Findings Report, act as Entry Criteria for a later process.
• Execute Test Cases (manual/automated testing)
• Report Defects – log defects, fill in the Traceability Matrix
• Regression Testing – rerun tests after code changes
• Analysis
• Summary Report (Closure)
Software Testing Terms
1. Precision and Accuracy
2. Verification and Validation
3. Quality and Reliability
4. Quality Control and Quality Assurance
5. V-Model
1. Precision and Accuracy
Accuracy :
• Accuracy is defined as ‘the degree to which the result of a measurement conforms to the correct value or a standard’ and essentially refers to how close a measurement is to its agreed value.
Precision:
• Precision is defined as ‘the quality of being exact’ and refers to how
close two or more measurements are to each other, regardless of
whether those measurements are accurate or not.
• It is possible for precise measurements to not be accurate.
Precision and Accuracy ….
• Both accuracy and precision reflect how close a measurement is to an
actual value, but they are not the same.
• Accuracy reflects how close a measurement is to a known or accepted
value, while precision reflects how reproducible measurements are,
even if they are far from the accepted value. Measurements that are both
precise and accurate are repeatable and very close to true values.
• Example: Hitting a dartboard.
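A small numerical sketch of the distinction; the measurement values are invented, and the quantity being measured is assumed to have a true value of 10.0:

#include <stdio.h>
#include <math.h>

/* Mean of an array: closeness of the mean to the true value indicates accuracy. */
static double mean(const double *x, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += x[i];
    return s / n;
}

/* Sample standard deviation: a smaller spread means higher precision. */
static double spread(const double *x, int n)
{
    double m = mean(x, n), s = 0.0;
    for (int i = 0; i < n; i++)
        s += (x[i] - m) * (x[i] - m);
    return sqrt(s / (n - 1));
}

int main(void)
{
    const double true_value = 10.0;
    double a[] = { 9.1, 10.8, 9.5, 10.6 };       /* accurate (mean 10.0) but not precise */
    double b[] = { 12.01, 12.02, 11.99, 12.00 }; /* precise (tight spread) but not accurate */

    printf("set A: mean %.2f vs true %.2f, spread %.2f\n", mean(a, 4), true_value, spread(a, 4));
    printf("set B: mean %.2f vs true %.2f, spread %.2f\n", mean(b, 4), true_value, spread(b, 4));
    return 0;
}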
2. Verification and Validation
Verification :
• Definition : The process of evaluating software to determine whether the
products of a given development phase satisfy the conditions imposed at the
start of that phase.
• Verification is a static practice of verifying documents, design, code and
program. It includes all the activities associated with producing high quality
software: inspection, design analysis and specification analysis. It is a
relatively objective process.
• Verification will help to determine whether the software is of high quality, but
it will not ensure that the system is useful. Verification is concerned with
whether the system is well-engineered and error-free.
• Methods of Verification : Static Testing
• Walkthrough
• Inspection
• Review
Verification and Validation…..
Validation:
• Definition: The process of evaluating software during or at the end of the
development process to determine whether it satisfies specified requirements.
• Validation is the process of evaluating the final product to check whether the
software meets the customer expectations and requirements. It is a dynamic
mechanism of validating and testing the actual product.
• Methods of Validation : Dynamic Testing
• Testing
• End Users
Verification vs. Validation
1. Verification is a static practice of verifying documents, design, code and program. Validation is a dynamic mechanism of validating and testing the actual product.
2. Verification does not involve executing the code. Validation always involves executing the code.
3. Verification is a human-based checking of documents and files. Validation is a computer-based execution of the program.
4. Verification uses methods like inspections, reviews, walkthroughs, and desk-checking. Validation uses methods like black box (functional) testing and white box (structural) testing.
5. Verification checks whether the software conforms to specifications. Validation checks whether the software meets the customer expectations and requirements.
6. Verification can catch errors that validation cannot catch; it is a low-level exercise. Validation can catch errors that verification cannot catch; it is a high-level exercise.
7. The target of verification is the requirements specification, application and software architecture, high-level and complete design, and database design. The target of validation is the actual product: a unit, a module, a set of integrated modules, and the effective final product.
8. Verification is done by the QA team to ensure that the software is as per the specifications in the SRS document. Validation is carried out with the involvement of the testing team.
3. Quality and Reliability
Components of Reliability
• Probability: the likelihood of mission success
• Intended function: for example, to light, cut, rotate, or
heat
• Satisfactory: perform according to a specification, with
an acceptable degree of compliance
• Specific period of time: minutes, days, months, or
number of cycles
• Specified conditions: availability to perform the function in the specified environment
Quality and Reliability…..
• Reliability has sometimes been classified as "how quality changes over
time."
• The difference between quality and reliability is that quality shows how
well an object performs its proper function, while reliability shows how
well this object maintains its original level of quality over time, through
various conditions.
• Asking a few key questions can help one determine the difference between
both quality and reliability:
• Quality = Does the object perform its intended function? If so, how well
does it perform its intended function?
• Reliability = To what level has said object maintained this level of quality
over time?
4. Quality Control and Quality Assurance
Quality Control:
Quality Control (QC) is a set of activities for enhancing quality in products. These activities focus on identifying defects in the actual products produced.
● Quality control focuses on operational technique and activities used
to fulfill and verify requirement of quality.
● Product oriented and focuses on defect identification.
● The goal of QC is to identify defects after the product is developed and before it is released.
● Responsibility lies with a specific team that tests the product for defects.
● Example: QC reviews, inspections, testing.
Quality Control and Quality Assurance…..
Quality Assurance (QA):
Quality Assurance (QA) is a set of activities for enhancing quality in the processes by which products are developed.
● Process oriented and focuses on defect Prevention.
● Goal of Quality Assurance is to improve development and test
process so that defects do not arise when product is being
developed.
● Responsibility lies with everyone involved in the development of the product.
● If the data provided through quality assurance identify problems, it is management's responsibility to address those problems.
● Example: QA audits, process documentation, developing checklists.
5. V-Model
• The V-Model is also referred to as the Verification and Validation Model. In it, each phase of the SDLC must be completed before the next phase starts. It follows a sequential design process, like the waterfall model, but testing of the product is planned in parallel with the corresponding stage of development.