Introduction to Asymptotic Analysis
CSE 2215 - Lecture 2
Instructor: Fahmid Al Rifat, Lecturer, Dept. of CSE, UIU
[email protected]
Today’s Goals
• Discuss the running time of programs.
• Compute and classify the growth of functions.
• Analyze complexity classes for algorithms.
What is an Algorithm?
An algorithm is a sequence of computational steps that solves a
well-specified computational problem.
• An algorithm is said to be correct if, for every input instance,
it halts with the correct output.
• An incorrect algorithm might not halt at all on some input
instances, or it might halt with an output other than the desired one.
What is a Program?
A program is the expression of an algorithm in a programming language:
a set of instructions which the computer will follow to solve a problem.
Define a Problem, and Solve It
Problem:
A description of the input-output relationship.
Algorithm:
A sequence of computational steps that transforms the input into
the output.
Data Structure:
An organized method of storing and retrieving data.
Our Task:
Given a problem, design a correct and good algorithm that solves it.
Define a Problem, and Solve It
Problem: Input is a sequence of integers stored in an array.
Output the minimum.

Algorithm:
    m := a[1]
    for i := 2 to size of input
        if m > a[i] then
            m := a[i]
    return m

INPUT instance: 25, 90, 53, 23, 11, 34    OUTPUT: 11
Data structure: the array a and the variable m.
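The minimum-finding pseudocode above can be transcribed into Python as a minimal sketch (the name `find_min` is illustrative; note that Python lists are 0-indexed):

```python
def find_min(a):
    """Return the minimum element of a non-empty list,
    mirroring the pseudocode: m := a[1], then scan the rest."""
    m = a[0]                   # m := a[1] (index 0 in Python)
    for i in range(1, len(a)):
        if m > a[i]:           # keep the smaller value
            m = a[i]
    return m

print(find_min([25, 90, 53, 23, 11, 34]))  # → 11
```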
What do we Analyze?
o Correctness
o Does the input/output relation match the algorithm's
requirement?
o Amount of work done (complexity)
o Basic operations to do task
o Amount of space used
o Memory used
o Simplicity, clarity
o Verification and implementation.
o Optimality
o Is it impossible to do better?
Running Time
The number of primitive steps that are executed.
Except for the time of executing a function call, most statements roughly require
the same amount of time:
    y = m * x + b
    c = 5 / 9 * (t - 32)
    z = f(x) + g(y)
We can be more exact if we need to be.
An Example: Insertion Sort
A = {5, 2, 4, 6, 1, 3}
An Example: Insertion Sort
InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
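A direct Python transcription of the pseudocode might look like the sketch below (0-indexed, so the outer loop starts at index 1):

```python
def insertion_sort(A):
    """Sort A in place by inserting each element into the
    already-sorted prefix A[0..i-1]."""
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        # shift elements larger than key one slot to the right
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j = j - 1
        A[j + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # → [1, 2, 3, 4, 5, 6]
```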
An Example: Insertion Sort
How many times will the inner while loop execute?
Analyzing Insertion Sort
Statement                                Cost   Times
InsertionSort(A, n) {
    for i = 2 to n {                     c1     n
        key = A[i]                       c2     n - 1
        j = i - 1                        c3     n - 1
        while (j > 0) and (A[j] > key) { c4     T
            A[j+1] = A[j]                c5     T - (n - 1)
            j = j - 1                    c6     T - (n - 1)
        }
        A[j+1] = key                     c7     n - 1
    }
}
T = t2 + t3 + … + tn, where ti is the number of times the while
condition is evaluated in the ith iteration of the for loop.
Analyzing Insertion Sort
T(n) = c1·n + c2(n-1) + c3(n-1) + c4·T + c5(T - (n-1)) + c6(T - (n-1)) + c7(n-1)
     = c8·T + c9·n + c10
What can T be?
Best case: the array is already sorted (the inner loop body is never executed)
    ti = 1 for every i, so T = n - 1
    T(n) = an + b, a linear function of n
Worst case: the array is reverse sorted (the inner loop body executes for all
previous elements)
    ti = i, so T = n(n + 1)/2 - 1
    T(n) = an² + bn + c, a quadratic function of n
Average case:
    ???
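The best- and worst-case counts can be checked empirically by instrumenting the inner loop of insertion sort (a sketch; `count_tests` is a hypothetical helper name, and the array is 0-indexed):

```python
def count_tests(A):
    """Return T: the total number of while-condition evaluations
    across all iterations of insertion sort's outer loop."""
    T = 0
    for i in range(1, len(A)):
        key, j = A[i], i - 1
        while True:
            T += 1                              # one test of the condition
            if not (j >= 0 and A[j] > key):
                break
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key
    return T

print(count_tests([1, 2, 3, 4, 5, 6]))  # sorted:   T = n - 1       = 5
print(count_tests([6, 5, 4, 3, 2, 1]))  # reversed: T = n(n+1)/2 - 1 = 20
```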
Asymptotic Performance
We care most about asymptotic performance
• How does the algorithm behave as the problem size gets
very large?
o Running time
o Memory/storage requirements
o Bandwidth/power requirements/logic gates/etc.
Asymptotic Analysis
Worst case
o Provides an upper bound on running time
o An absolute guarantee of required resources
Average case
o Provides the expected running time
o Very useful, but treat with care: what is “average”?
o Random (equally likely) inputs
o Real-life inputs
Best case
o Provides a lower bound on running time
Upper Bound Notation
We say InsertionSort's run time is O(n²).
Properly we should say the run time is in O(n²).
Read O as "Big-O" (you'll also hear it called "order").
In general, a function f(n) is O(g(n)) if there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
Formally,
    O(g(n)) = { f(n) : ∃ positive constants c and n0 such that
                0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0 }
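The definition can be exercised numerically. For example, f(n) = 3n² + 5n is O(n²) with witnesses c = 8 and n0 = 1, since 5n ≤ 5n² for n ≥ 1. A finite check like the sketch below only illustrates the definition; it does not prove the bound:

```python
f = lambda n: 3 * n**2 + 5 * n   # the function being bounded
g = lambda n: n**2               # the bounding function
c, n0 = 8, 1                     # witnesses for the O(n^2) claim

# check 0 <= f(n) <= c*g(n) on a sample of n >= n0
ok = all(0 <= f(n) <= c * g(n) for n in range(n0, 1000))
print(ok)  # → True
```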
Upper Bound Notation
[Figure: f(n) lies below c·g(n) for all n ≥ n0.]
We say g(n) is an asymptotic upper bound for f(n).
Insertion Sort is O(n2)
Proof
The run time is an² + bn + c.
If any of a, b, and c are less than 0, replace the constant with
its absolute value.
    an² + bn + c ≤ (a + b + c)n² + (a + b + c)n + (a + b + c)
                 ≤ 3(a + b + c)n² for n ≥ 1
Let c' = 3(a + b + c) and let n0 = 1. Then
    an² + bn + c ≤ c'·n² for n ≥ 1
Thus an² + bn + c = O(n²).
Question
    Is InsertionSort O(n³)?
    Is InsertionSort O(n)?
Lower Bound Notation
We say InsertionSort's run time is Ω(n).
In general, a function f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that
    0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0
Proof:
    Suppose the run time is an + b.
    Assume a and b are positive.
    Then an ≤ an + b for all n, so take c = a and n0 = 1.
Lower Bound Notation
[Figure: f(n) lies above c·g(n) for all n ≥ n0.]
We say g(n) is an asymptotic lower bound for f(n).
Asymptotic Tight Bound
A function f(n) is Θ(g(n)) if ∃ positive constants c1, c2, and n0 such that
    0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0
Theorem
    f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n))
Proof: immediate from the definitions: the Ω witness gives c1, the O witness
gives c2, and the larger of the two n0 values works for both inequalities.
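The same numeric sanity check works for Θ: for f(n) = 3n² + 5n, the constants c1 = 3, c2 = 8, n0 = 1 witness both bounds at once (again an illustration of the definition, not a proof):

```python
f = lambda n: 3 * n**2 + 5 * n
g = lambda n: n**2
c1, c2, n0 = 3, 8, 1   # witnesses for the Theta(n^2) claim

# check 0 <= c1*g(n) <= f(n) <= c2*g(n) on a sample of n >= n0
ok = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print(ok)  # → True
```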
Asymptotic Tight Bound
[Figure: f(n) lies between c1·g(n) and c2·g(n) for all n ≥ n0.]
We say g(n) is an asymptotic tight bound for f(n).
Practical Complexity
For large input sizes, constant factors are insignificant.
    Program A with running time TA(n) = 100n
    Program B with running time TB(n) = 2n²
[Figure: the two curves cross at n = 50, where TA(50) = TB(50) = 5000;
for larger n, Program A is faster.]
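The crossover can be located directly from the two running-time functions (a sketch; the search bound 10⁶ is arbitrary):

```python
TA = lambda n: 100 * n     # Program A
TB = lambda n: 2 * n**2    # Program B

# first input size where the asymptotically better program wins
crossover = next(n for n in range(1, 10**6) if TA(n) < TB(n))
print(crossover, TA(50), TB(50))  # → 51 5000 5000
```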
Practical Complexity
[Figures: f(n) = log n, n, n log n, n², n³, and 2ⁿ plotted for n = 1 to 20
at increasing y-axis scales (250, 500, 1000, 5000), and on a logarithmic
scale for n up to 65536; 2ⁿ quickly dominates every polynomial curve.]
Practical Complexity
Function   Descriptor    Big-Oh
c          Constant      O(1)
log n      Logarithmic   O(log n)
n          Linear        O(n)
n log n    n log n       O(n log n)
n²         Quadratic     O(n²)
n³         Cubic         O(n³)
n^k        Polynomial    O(n^k)
2ⁿ         Exponential   O(2ⁿ)
n!         Factorial     O(n!)
Other Asymptotic Notations
A function f(n) is o(g(n)) if, for every positive constant c, ∃ n0
such that
    0 ≤ f(n) < c·g(n) ∀ n ≥ n0
A function f(n) is ω(g(n)) if, for every positive constant c, ∃ n0
such that
    0 ≤ c·g(n) < f(n) ∀ n ≥ n0
Intuitively,
    ■ o( ) is like <    ■ ω( ) is like >    ■ Θ( ) is like =
    ■ O( ) is like ≤    ■ Ω( ) is like ≥
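Intuition for the strict bounds: f(n) is o(g(n)) exactly when the ratio f(n)/g(n) tends to 0 as n grows (and ω when it tends to infinity). A quick numeric illustration:

```python
# Little-o demands f(n)/g(n) -> 0: for f(n) = n, g(n) = n^2 the
# ratio shrinks toward 0, while for f(n) = n^2 it stays at 1,
# so n is o(n^2) but n^2 is not.
for n in (10, 100, 1000, 10000):
    print(n / n**2, (n**2) / n**2)
```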
Thank You