Analysis of Loops - Module One Cont'd: Asymptotic Analysis Worst, Average and Best Cases Asymptotic Notations
We have discussed Asymptotic Analysis, Worst, Average and Best Cases and Asymptotic
Notations.
1) O(1): A loop or recursion that runs a constant number of times is also considered O(1).
For example, the following loop is O(1):
// Here c is a constant
for (int i = 1; i <= c; i++) {
// some O(1) expressions
}
2) O(n): The time complexity of a loop is considered O(n) if the loop variable is incremented /
decremented by a constant amount.
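A minimal sketch of such a loop (the counting helper is hypothetical, added here only to make the iteration count visible): incrementing by a constant step c makes the body run about n/c times, which is O(n).

```c
#include <assert.h>

/* Counts how many times the body of a loop executes when the
   loop variable is incremented by a constant amount c.
   The count is roughly n/c, i.e. O(n). */
int count_linear_iterations(int n, int c) {
    int count = 0;
    for (int i = 1; i <= n; i += c) {
        count++;  /* some O(1) expression */
    }
    return count;
}
```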
3) O(n^c): The time complexity of nested loops is equal to the number of times the innermost
statement is executed. For example, a pair of nested loops that each run n times has O(n^2) time
complexity.
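A sketch of such nested loops (the counter is hypothetical, included so the total can be checked): the innermost statement runs n * n times, so the whole construct is O(n^2).

```c
#include <assert.h>

/* Two nested loops, each running n times: the innermost
   statement executes n * n times, so this is O(n^2). */
int count_nested_iterations(int n) {
    int count = 0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            count++;  /* innermost O(1) statement */
        }
    }
    return count;
}
```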
For example, Selection Sort and Insertion Sort have O(n^2) time complexity.
A loop whose variable is multiplied or divided by a constant amount each iteration is O(log n). For example, Binary Search (refer to the iterative implementation) has O(log n) time complexity.
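A sketch of such a loop (the counting helper is hypothetical): doubling the loop variable each pass means the body runs about log2(n) times, the same pattern as halving the search range in binary search.

```c
#include <assert.h>

/* When the loop variable is multiplied by a constant (here 2)
   each pass, the body runs about log2(n) times, i.e. O(log n). */
int count_log_iterations(int n) {
    int count = 0;
    for (int i = 1; i < n; i *= 2) {
        count++;  /* O(1) work, like one probe of binary search */
    }
    return count;
}
```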
4) For a loop, the running time is the sum of the times to execute the body of the loop and
the time to evaluate the condition for termination (which is O(1)). The total time is at most the
product of the number of times around the loop and the largest possible time for one
execution of the body, though each loop has to be considered separately.
5) For a program that has procedures/functions which are not recursive, we first compute the
running times of the called procedures that call no other procedures, and then compute the
running times of the procedures/functions that call those procedures/functions, using the
running times already computed. If we have recursive procedures, we need to
associate an unknown time function T(n) with each recursive procedure, where n is the
size of the arguments. We then get a recurrence for T(n): an equation for T(n) in
terms of T(k) for different values of k.
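A hypothetical recursive procedure, sketched to show how T(n) is set up (the function itself is an illustration, not from the notes): summing 1..n makes one recursive call on a problem of size n - 1 plus constant work.

```c
#include <assert.h>

/* Associate an unknown time function T(n) with this procedure:
       T(n) = T(n-1) + c   for n > 0,   T(0) = c'
   which solves to T(n) = O(n). */
long recursive_sum(int n) {
    if (n <= 0) return 0;            /* base case: constant time */
    return n + recursive_sum(n - 1); /* one call on size n - 1 */
}
```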
For example, the total inner-loop work in Selection Sort or Insertion Sort is an arithmetic
series with n - 1 terms, first term a = 1 and last term l = n - 1. Using the arithmetic-series
sum Sn = (n/2)(a + l):

S = ((n - 1)/2)(1 + (n - 1)) = ((n - 1)/2)(n) = O(n^2)
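The sum above can be checked with a small sketch (the counter is hypothetical): a triangular nested loop of the Selection/Insertion Sort shape executes its inner body 1 + 2 + ... + (n - 1) = n(n - 1)/2 times.

```c
#include <assert.h>

/* Counts the inner-body executions of a triangular loop, the
   worst-case pattern in Selection/Insertion Sort:
   1 + 2 + ... + (n-1) = n*(n-1)/2, i.e. O(n^2). */
long triangular_count(int n) {
    long count = 0;
    for (int i = 1; i < n; i++)
        for (int j = 0; j < i; j++)
            count++;               /* inner body */
    return count;                  /* equals n*(n-1)/2 */
}
```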
The following are the running times of the sorting algorithms considered:
Insertion Sort: O(n^2)
Selection Sort: O(n^2)
Shell Sort: O(n^1.5), proved by Donald Knuth
Quicksort: O(n^2) (worst case; O(n log n) on average)
Heapsort: O(n log n)
Mergesort: O(n log n)
How do we calculate the time complexity of recursive functions?
The time complexity of a recursive function can be written as a mathematical recurrence relation. To
calculate the time complexity, we must know how to solve recurrences. We will discuss
recurrence-solving techniques in the next module(s).
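As a preview, here is a hypothetical example of writing such a recurrence (the function is an illustration, not part of the notes): the naive recursive Fibonacci makes two recursive calls plus constant work.

```c
#include <assert.h>

/* The naive recursive Fibonacci has the recurrence
       T(n) = T(n-1) + T(n-2) + c,
   which, as we will see when solving recurrences,
   grows exponentially in n. */
int fib(int n) {
    if (n <= 1) return n;           /* base cases: O(1) */
    return fib(n - 1) + fib(n - 2); /* two recursive calls */
}
```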