CPSC 122 – Data Structures
Washington State University – Department of Computer Science
Efficiency of Algorithms
Analysis of Algorithms
Provides tools for comparing different algorithms
Examines specific differences in the efficiency of the algorithms
Looks at several aspects of efficiency, not just how long the code is
The efficient use of time and the efficient use of memory are the two most important aspects we examine
Time Efficiency
How are the algorithms coded?
The way an algorithm is coded affects its running time
What computer should you use?
The measure of efficiency should be independent of the type of computer being used
What data should the programs use?
Different data/use cases might give you different run times
Algorithm's Execution Time
Related to the number of operations (statement executions) it requires
Linked list versus array list
Execution time is expressed in terms of the number n
Number of items the algorithm needs to process
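As a small illustration (not from the slides), the sketch below counts how many times a loop body runs for n items; the vector contents are arbitrary:

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> items{4, 8, 15, 16, 23, 42};   // n = 6 items to process
        int n = static_cast<int>(items.size());

        long long sum = 0;
        int bodyExecutions = 0;                         // tally of loop-body executions
        for (int i = 0; i < n; ++i) {                   // runs once per item
            sum += items[i];
            ++bodyExecutions;
        }

        // The body executes exactly n times, so the time grows in proportion to n.
        std::cout << "sum = " << sum << ", body ran " << bodyExecutions << " times\n";
        return 0;
    }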
Analysis and Big O Notation
Algorithm A is said to require time proportional to f(n), or to be of order f(n)
Order of f(n) - denoted O(f(n))
f(n) - the algorithm's growth function
Big O notation: notation using the letter O to denote order
Example – problem of size n requires time directly proportional to n – problem is O(n)
Big O Notation Definition
Example: n^2 - 3n + 10 is O(n^2)
Note: Big O can be seen as an overestimation (an upper bound)
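For reference, the standard formal definition that the heading points to (not written out in these notes) is:

    f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0

For the example above, n^2 - 3n + 10 <= 2n^2 for every n >= 2, so c = 2 and n0 = 2 witness that it is O(n^2).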
Algorithm Growth Rate
An algorithm's time requirement depends on the problem size
Number of nodes in the linked list
Size of array
Number of items in a stack
Reach conclusions such as:
Algorithm A: requires n^2/5 time units to solve a problem of size n
Algorithm B: requires 5n time units to solve a problem of size n
Algorithm Growth Rate
Further estimate time efficiency by stating:
• Algorithm A: requires time proportional to n^2
• Algorithm B: requires time proportional to n
Even though you do not know the exact time it takes for Algorithm A or Algorithm B to solve a problem of a specific size
You know that Algorithm B takes less time than Algorithm A for large problems
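A quick numeric check of that claim (the value n = 100 is chosen only for illustration):

    n = 100: \quad \tfrac{n^2}{5} = \tfrac{100^2}{5} = 2000 \ \text{time units for A}, \qquad 5n = 500 \ \text{time units for B}

For small problems the ordering can flip (at n = 10, A needs 20 units and B needs 50); the two growth functions cross at n = 25, after which B is always faster.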
Growth Rates
Properties of growth-rate functions
Several mathematical properties of Big O notation help to simplify the analysis of an
algorithm.
1. You can ignore low-order terms in an algorithm’s growth-rate function.
§ O(n^3 + 4n^2 + 3n) = O(n^3)
2. You can ignore a multiplicative constant in the high-order term of an algorithm’s growth-
rate function.
§ O(5n^3) = O(n^3)
3. You can combine growth-rate functions.
§ O(f(n)) + O(g(n)) = O(f(n) + g(n))
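A short worked example (not from the slides) applying all three properties to an algorithm with two phases, one taking 3n^2 steps and one taking 10n steps:

    O(3n^2) + O(10n) = O(3n^2 + 10n) = O(n^2)

Property 3 combines the two phases, property 1 drops the low-order 10n term, and property 2 drops the constant factor 3.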
Example #1
Traversal of Linked List
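The traversal code itself is not reproduced in these notes; a minimal C++ sketch of the idea, using a hypothetical Node struct, is:

    #include <iostream>

    struct Node {
        int data;
        Node* next;
    };

    // Visit every node exactly once: the loop body runs n times for a list
    // of n nodes, so traversal is O(n).
    void printList(const Node* head) {
        for (const Node* cur = head; cur != nullptr; cur = cur->next) {
            std::cout << cur->data << ' ';
        }
        std::cout << '\n';
    }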
Example #2
Nested For Loop
Solution
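The slide's code is not included in these notes; a representative sketch of a doubly nested for loop and its analysis (the function name and counter are illustrative):

    // Counts how many times the inner body runs for a problem of size n.
    // The inner body executes n times for each of the n outer iterations,
    // so the total is n * n = n^2 and the nested loop is O(n^2).
    long long nestedLoopWork(int n) {
        long long count = 0;
        for (int i = 0; i < n; ++i) {
            for (int j = 0; j < n; ++j) {
                ++count;
            }
        }
        return count;   // equals n^2
    }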
Example #3
A for loop with a nested do_while loop
Solution
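The original code and answer are not in these notes; one plausible shape, assumed here, is a for loop whose inner do_while halves its counter each pass, which gives O(n log n):

    // The outer for loop runs n times; the inner do-while halves j until it
    // reaches 0, which takes about log2(n) + 1 passes. Total work is then
    // roughly n * log2(n), i.e., O(n log n) under this assumption.
    long long mixedLoopWork(int n) {
        long long steps = 0;
        for (int i = 0; i < n; ++i) {
            int j = n;
            do {
                ++steps;
                j /= 2;
            } while (j > 0);
        }
        return steps;
    }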
Worst Case and Average Case
A particular algorithm might require different times to solve different problems of the same
size.
The Worst Case: the maximum amount of time that an algorithm can require to solve a problem of size n
A worst-case estimate shows that the algorithm is never slower than that estimate
The Average Case: the amount of time the algorithm requires on average, taken over all problems of size n
Sequential Search Algorithms
Worst case: O(n)
Average case: O(n/2) = O(n)
Best case: O(1)
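A minimal sequential search sketch, to make the comparison counts behind those cases concrete:

    #include <vector>

    // Returns the index of target in a, or -1 if it is absent.
    // Best case: target is at index 0 -> 1 comparison, O(1).
    // Worst case: target is last or missing -> n comparisons, O(n).
    // Average case: about n/2 comparisons, which is still O(n).
    int sequentialSearch(const std::vector<int>& a, int target) {
        for (int i = 0; i < static_cast<int>(a.size()); ++i) {
            if (a[i] == target) {
                return i;
            }
        }
        return -1;
    }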
Binary search
Big-O Analysis
Unit of work: comparisons
Best case
– target value is at first midpoint
– O(1) comparisons
Worst case
– target value is not found
– list is cut in half until it is reduced to a list of size 0
– How many times can the list be cut in half? The number of times a number n can be divided by another number m before it reaches 1 is log_m(n), so the answer is log2(n) comparisons, which is O(lg n)
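A standard iterative binary search (assuming the vector is already sorted in ascending order), showing why one comparison discards half of the remaining list:

    #include <vector>

    // Each iteration makes one comparison and keeps only half of the
    // remaining range, so at most about log2(n) + 1 iterations occur:
    // O(lg n) in the worst case, O(1) when the first midpoint matches.
    int binarySearch(const std::vector<int>& a, int target) {
        int low = 0;
        int high = static_cast<int>(a.size()) - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // midpoint without overflow
            if (a[mid] == target) {
                return mid;
            } else if (a[mid] < target) {
                low = mid + 1;                  // keep the upper half
            } else {
                high = mid - 1;                 // keep the lower half
            }
        }
        return -1;                              // not found: the worst case
    }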
Analysis of Algorithms
Example:
Insertion Sort
Set a marker after the first element
Select the first unsorted element
Shift larger sorted elements to the right to create the correct position for the unsorted element
Advance to the next unsorted element
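A C++ sketch of those steps (using shifting rather than repeated swaps, one common way to code it); the worst case is O(n^2), and the best case, an already sorted list, is O(n):

    #include <vector>

    // The marker i starts after the first element. Each pass copies the
    // first unsorted element aside, shifts larger sorted elements one slot
    // to the right, and drops the element into the opened position.
    void insertionSort(std::vector<int>& a) {
        for (int i = 1; i < static_cast<int>(a.size()); ++i) {
            int unsortedValue = a[i];            // first unsorted element
            int j = i - 1;
            while (j >= 0 && a[j] > unsortedValue) {
                a[j + 1] = a[j];                 // shift right
                --j;
            }
            a[j + 1] = unsortedValue;            // place it in position
        }
    }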