01 - IT623 Algorithms & Data Structures - Asymptotic Notation
Asymptotic Notation Θ, O, Ω
Analysis of Algorithm for Time and Space
• To analyze an algorithm means:
• developing a formula for predicting how fast an algorithm is, based on the
size of the input (time complexity), and/or
• developing a formula for predicting how much memory an algorithm requires,
based on the size of the input (space complexity)
• Usually time is our biggest concern
• Most algorithms require a fixed amount of space and/or
• Memory is not as expensive as time
Problem Size Matters in Calculating Time or Space Complexity
• Time and Space complexity will depend on the Problem Size, why?
• If we are searching an array, the “size” of the input array can determine how
long it will take to go through the entire array
• If we are merging two arrays, the “size” could be the sum of the two array
sizes
• If we are computing the nth Fibonacci number, or the nth factorial, the “size” is
n
• We choose the “size” to be a parameter that determines the actual
time (or space) required
• It is usually obvious what this parameter is
• Sometimes we need two or more parameters
Characteristic Operation
• In computing time complexity, one good approach is to count
characteristic operations
• What a “characteristic operation” is depends on the particular problem
• If searching, it might be comparing two values
• If sorting an array, it might be:
• comparing two values
• swapping the contents of two array locations
• both of the above
• Sometimes we just look at how many times the innermost loop is executed
How many times innermost loop will execute?
Algorithm Array_Sum(A):
    for i := 0 to n-1 do
        sum := sum + A[i]
    return sum;

Algorithm Array2D_Sum(A):
    for i := 0 to n-1 do
        for j := 0 to n-1 do
            sum := sum + A[i][j]
    return sum;
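Counting how many times the innermost loop body runs can be done directly; here is a Python sketch of the two algorithms above, instrumented with a counter (the function names are illustrative):

```python
def array_sum(A):
    """Sum a 1-D list; the loop body executes len(A) = n times."""
    total, count = 0, 0
    for x in A:
        total += x
        count += 1
    return total, count

def array2d_sum(A):
    """Sum an n x n list of lists; the inner body executes n * n times."""
    total, count = 0, 0
    for row in A:
        for x in row:
            total += x
            count += 1
    return total, count
```

For n = 1000, array_sum performs 1000 additions while array2d_sum performs 1,000,000, which is why the inner-loop count is a useful proxy for running time.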
Sequential Search Analysis
Input: An array A storing n items, target x
Output: true if x is in A, false otherwise
Algorithm Sequential_Search(A, x):
    for i := 0 to n-1 do
        if (A[i] = x)
            return true;
    return false;
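A direct Python rendering of Sequential_Search (a sketch; in the worst case, when x is absent, all n elements are compared):

```python
def sequential_search(A, x):
    """Return True if x occurs in list A, scanning left to right.

    Best case: 1 comparison (x is first). Worst case: n comparisons
    (x is last or absent), so the time complexity is O(n).
    """
    for item in A:
        if item == x:
            return True
    return False
```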
• With n = 50000
  • f1(50000) = 2,500,000,000 and
  • f2(50000) = 2,500,000,000 + 50,000 + 10 = 2,500,050,010 ≈ 2,500,000,000
  • A difference of only 0.002%
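The figures above match f1(n) = n² and f2(n) = n² + n + 10 (an assumption inferred from the values shown); the negligible gap between them can be verified directly:

```python
# Assumed from the slide's numbers: f1(n) = n**2, f2(n) = n**2 + n + 10.
n = 50_000
f1 = n ** 2              # 2_500_000_000
f2 = n ** 2 + n + 10     # 2_500_050_010

# Relative difference in percent: about 0.002%, so the lower-order
# terms contribute almost nothing at this input size.
diff_pct = (f2 - f1) / f1 * 100
```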
What is the time complexity ?
Algorithm Dummy(A):
    for i := 1 to n step 2*i do
        // do something

Algorithm Dummy1(A):
    for i := n to 1 step –i/2 do
        // do something
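Both loop headers change the index multiplicatively rather than by a constant, so the trip count is logarithmic in n. A sketch under one plausible reading, where the index doubles each pass (with the literal step 2*i the index would triple, which is still O(log n)):

```python
def dummy_iterations(n):
    """Count loop passes of: i := 1; while i <= n: i := 2 * i.

    The index doubles each pass, so the count is floor(log2(n)) + 1,
    i.e. the loop runs O(log n) times.
    """
    i, count = 1, 0
    while i <= n:
        count += 1
        i *= 2
    return count
```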
What is the time complexity?
Algorithm Algo1():
    for i := 0 to n-1 do
        // do something
    for j := 0 to m-1 do
        // do something

Algorithm Algo2():
    for i := 0 to n-1 do
        for j := 1 to n step j*2 do
            sum := sum + A[i][j]
    return sum;
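Counting body executions confirms the two shapes: two sequential loops give O(n + m), while an inner loop whose index doubles, nested inside an O(n) loop, gives O(n log n). A hypothetical Python sketch:

```python
def algo1_count(n, m):
    """Two loops in sequence: n + m body executions, i.e. O(n + m)."""
    count = 0
    for _ in range(n):
        count += 1   # first loop body
    for _ in range(m):
        count += 1   # second loop body
    return count

def algo2_count(n):
    """Outer loop runs n times; the inner index doubles each pass,
    so each inner loop runs about log2(n) times: O(n log n) overall."""
    count = 0
    for _ in range(n):
        j = 1
        while j <= n:
            count += 1
            j *= 2
    return count
```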
Big O notation – Simplifying f(n)
• Throwing out the constants is one of two things we do in analysis of
algorithms
• By throwing out constants, we simplify 12n² + 35 to just n²
• Our timing formula is a polynomial, and may have terms of various
orders (constant, linear, quadratic, cubic, etc.)
• We usually discard all but the highest-order term
• We simplify n² + 3n + 5 to just n²
• We call this Big O notation
Asymptotic Notation
f(n) ∈ O(g(n))
Usually written as
f(n) = O(g(n)) or
f(n) is O(g(n))
Can we Justify Asymptotic Notation?
• Consider f(n) = n² + 3n + 5 as n varies:

  n = 0        n² = 0           3n = 0       f(n) = 5
  n = 10       n² = 100         3n = 30      f(n) = 135
  n = 100      n² = 10,000      3n = 300     f(n) = 10,305
  n = 1000     n² = 1,000,000   3n = 3,000   f(n) = 1,003,005
  n = 10,000   n² = 10⁸         3n = 3×10⁴   f(n) = 100,030,005
  n = 100,000  n² = 10¹⁰        3n = 3×10⁵   f(n) = 10,000,300,005
[Figure: plot of f(x) = x² + 3x + 5 for x = 1..10]
Common time complexities
BETTER  • O(1)        constant time
        • O(log n)    logarithmic time
        • O(n)        linear time
        • O(n log n)  log-linear time
        • O(n²)       quadratic time
        • O(n³)       cubic time
        • O(nᵏ)       polynomial time
        • O(2ⁿ)       exponential time
WORSE
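Plugging a concrete input size into each class makes the ordering above tangible; for example, at n = 1024:

```python
import math

# How the common complexity classes compare at a single input size.
n = 1024
vals = {
    "O(1)": 1,
    "O(log n)": math.log2(n),        # 10.0
    "O(n)": n,                       # 1024
    "O(n log n)": n * math.log2(n),  # 10240.0
    "O(n^2)": n ** 2,                # 1_048_576
    "O(2^n)": 2 ** n,                # a 309-digit number
}
```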
Common Time Complexities
Big O Visual - O(g(n)) is the set of functions
with smaller or same order of growth as g(n)

O(1):        100;  5;  0.5
O(N):        3N + 100;  100N;  0.5N − 20
O(N log N):  20N log N + 3N + 100;  N log N + 100N;  100N log N − 0.5N − 20
O(N²):       5N² + 3N + 100;  15N² + 100;  N² + 100N log N − 20
O(N³):       5N³ + 3N + 100;  23N³ + 5N² + 100N;  N³ + 100N log N − 20
Algorithm Examples for Time Complexity
Big O Notation   Name          Example(s)
O(1)             Constant      # Odd or even number
                               # Swapping two numbers
O(log n)         Logarithmic   # Finding an element in a sorted array
                                 with binary search
O(n)             Linear        # Find max element in unsorted array
                               # Duplicate elements in array with Hash Map
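The O(log n) row is worth seeing concretely; here is a sketch of binary search over a sorted list, where each comparison halves the remaining range:

```python
def binary_search(A, x):
    """Return True if x occurs in the sorted list A.

    Each iteration halves the search range, so at most about
    log2(n) + 1 comparisons are made: O(log n).
    """
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == x:
            return True
        elif A[mid] < x:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return False
```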
[Figure: values of 30n+8 and 31n plotted against increasing n]
• 30n+8 is O(n): it is less than 31n everywhere to the right of n = 8
• That is, 30n+8 ≤ c·n holds with c = 31 for all n ≥ n0 = 8
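The witness pair (c, n0) = (31, 8) for 30n+8 ∈ O(n) can be checked mechanically:

```python
# 30n + 8 <= 31n simplifies to 8 <= n, so the bound holds exactly
# from n0 = 8 onward and fails just to the left of it.
assert all(30 * n + 8 <= 31 * n for n in range(8, 10_000))
assert 30 * 7 + 8 > 31 * 7   # n = 7 is left of n0, bound fails
```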
No Uniqueness
• There is no unique set of values for n0 and c in proving the asymptotic bounds
• ½n² − ½n ≥ ½n² − (½n)(½n) = ¼n² for n ≥ 2, so c1 = ¼ works
• n² ≠ Θ(n log n): c1·n log n ≤ n² ≤ c2·n log n would require
  c2 ≥ n/log n for all n ≥ n0, which is impossible for a constant c2
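The impossibility argument (a constant c2 would have to dominate n/log n) is easy to see numerically, here assuming the comparison is between n² and n·log n:

```python
import math

# n**2 <= c2 * n * log2(n) would force c2 >= n / log2(n),
# but that ratio grows without bound as n increases:
ratios = [n / math.log2(n) for n in (16, 256, 65536)]
# ratios == [4.0, 32.0, 4096.0], strictly increasing,
# so no single constant c2 works for all n.
```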
Asymptotic Notation Sets