Algorithm Complexity Analysis

This document discusses algorithm complexity analysis and asymptotic notation. It explains that time complexity analysis is used to determine the time requirement of an algorithm, while space complexity analysis determines memory requirements. Asymptotic notation like Big-O notation is used because determining the exact step count of a program is difficult. Big-O notation provides an upper bound for the growth rate of a function. Common time complexities include constant (O(1)), linear (O(n)), quadratic (O(n^2)), cubic (O(n^3)), and exponential (O(2^n)). Examples are provided to demonstrate calculating the time complexity of different algorithms using Big-O notation.

ALGORITHM COMPLEXITY ANALYSIS
Complexity Analysis
 Complexity analysis determines an estimate of the time and memory requirements of an algorithm.
 Estimating the running time is called time complexity analysis.
 Estimating the memory required is called space complexity analysis.
 Because memory is cheap and abundant, we rarely do space complexity analysis.
 Since time is “expensive”, analysis now defaults to time complexity analysis.
Asymptotic Notation
 Determining the exact step count of a program is a very difficult task.
 This is why step counts are not used for comparing programs.
 Instead, we determine the rate of growth of a function.
 Suppose we have two programs with complexities c1n^2 + c2n and c3n.
 Then the c3n program will always be faster for sufficiently large values of n.
 If c1 = 1, c2 = 2 and c3 = 100, then
c1n^2 + c2n <= c3n for n <= 98
c1n^2 + c2n > c3n for n > 98
 There will always be a value of n beyond which the program with complexity c3n is faster than the program with complexity c1n^2 + c2n.
 This value of n is called the break-even point.
 If the break-even point is zero, then the c3n program is always faster.
 In general, the exact break-even point cannot be determined analytically; the programs have to be run on a computer in order to determine it.
BIG O NOTATION
 Definition
“f(n) = O(g(n)), read as f(n) is of order g(n), iff there exist positive constants c and M such that f(n) <= M·g(n) for all n >= c.”
 The statement f(n) = O(g(n)) merely states that g(n) is an upper bound on the value of f(n) for all n >= c; equivalently, f(n) grows at the same rate as, or slower than, g(n).
 n + 5 = Θ(n), and n + 5 = O(n) = O(n^2) = O(n^3) = O(n^5)
 The closest estimate is n + 5 = Θ(n), but the general practice is to use the Big-O notation: n + 5 = O(n)
 O(1) is used for constant computing time.
 O(n) is called linear computing time.
 O(n^2) is called quadratic computing time.
 O(n^3) is called cubic computing time.
 O(2^n) is called exponential computing time.
 Other computing times which are often used are O(log n) and O(n log n).
 Theorem
If f(n) = a_m n^m + … + a_1 n + a_0, then f(n) = O(n^m).
Example 1:
f(n) = 3n + 2. To show that f(n) is O(n), we proceed as follows:
3n <= 3n for n >= 0
2 <= 2n for n >= 1
So for n >= 1,
f(n) <= 3n + 2n = 5n, i.e. f(n) <= M·g(n) for all n >= c, with M = 5 and c = 1.
Hence f(n) = O(n).
Example 2
f(n) = 10n^2 + 4n + 2. To show that f(n) is O(n^2), we proceed as follows:
10n^2 <= 10n^2 for n >= 0
4n <= 4n^2 for n >= 1
2 <= 2n^2 for n >= 1
So for n >= 1, f(n) <= 16n^2. Hence f(n) = O(n^2).
 We use Big-O notation to obtain a crude estimate of the smallest upper bound for worstTime(n) and averageTime(n).
 Example 3
double sum = 0;
for (int i = 0; i < 1000; i++)
    sum += sqrt(i);
In this case the loop always runs the same number of times, so the worst time is constant and independent of n. Hence worstTime(n) is O(1).
Example 6
for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
    {
        // One or more statements.
    }
Here, worstTime(n) is O(n^2).
Example 7
for (i = 0; i < n; i++)
    for (j = i; j < n; j++)
    {
        // S: one or more statements
    }
The number of times S is executed is
n + (n-1) + (n-2) + … + 2 + 1 = n(n+1)/2
Hence worstTime(n) is O(n^2).
 Example 8
for (i = 0; i < n; i++)
{
    // One or more statements
}
for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
    {
        // One or more statements.
    }
Here, worstTime(n) is O(n) for the first loop and O(n^2) for the second. Hence, for the two of them, worstTime(n) is O(n^2).
3. Study the following. Find the worstTime(n) of each of the following:
1. for (i = 0; i < n; i++)
       for (j = 0; j*j < n; j++)
           S
2. for (i = 0; sqrt(i) < n; i++)
       S
3. k = 1;
   for (i = 0; i < n; i++)
       k *= 2;
   for (i = 0; i < k; i++)
       S
Growth of functions
[Figure: curves comparing the growth of O(log n), O(n), O(n log n), O(n^2) and O(2^n) for n = 1 to 9.]
