4. ASYMPTOTIC NOTATIONS

The document explains asymptotic notations used to analyze algorithm complexity, including Big Oh (O), Big Theta (Θ), and Big Omega (Ω). It categorizes functions based on their growth rates and provides examples to illustrate how these notations describe upper and lower bounds of algorithm performance. The increasing order of function weightage is also outlined, emphasizing the significance of these notations in algorithm analysis.


ASYMPTOTIC NOTATIONS
TYPES OF FUNCTIONS
• O(1) – Constant
• O(log n) – Logarithmic
• O(n) – Linear
• O(n²) – Quadratic
• O(n³) – Cubic
• O(2ⁿ) – Exponential

Increasing order of their weightage:

1 < log n < √n < n < n log n < n² < 2ⁿ < n! < nⁿ
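The ordering above is asymptotic, so it can be checked numerically once n is large enough (for small n some pairs swap, e.g. log n vs. √n). A small illustrative sketch, not from the slides:

```python
import math

# Evaluate each growth-rate function at a few sizes and confirm the
# values line up in the stated order: 1 < log n < sqrt n < n < n log n
# < n^2 < 2^n < n! < n^n. The ordering holds once n is large enough.
def growth_values(n):
    return [
        1,
        math.log2(n),
        math.sqrt(n),
        n,
        n * math.log2(n),
        n ** 2,
        2 ** n,
        math.factorial(n),
        n ** n,
    ]

for n in (64, 256, 1024):
    vals = growth_values(n)
    assert vals == sorted(vals)  # increasing at these sizes
```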
INTRODUCTION
• Asymptotic notations are a language that lets us analyze an algorithm’s
running time by describing how it behaves as the input size grows.

• When analyzing the complexity of an algorithm in terms of time and space,
we can rarely give an exact number for the time and space required. Instead,
we express them using standard notations, known as asymptotic notations.
Asymptotic Notation
• The space and time required by a program are non-negative quantities, so we
assume that f(n) is non-negative for all values of n.

• Asymptotic notation describes the behaviour of the time/space complexity for
large instance characteristics.

• Example step count of a program: f(n) = 3n² + 6n log n + 7n + 5.
We say its asymptotic complexity is O(n²).
Let us take an example: if some algorithm has a time complexity of T(n) = n² + 3n
+ 4, a quadratic function, then for large values of n the 3n + 4 part becomes
insignificant compared to the n² part.
TYPES OF ASYMPTOTIC NOTATIONS
We use three types of asymptotic notations to represent the growth of any algorithm,
as input increases:

Big Oh (O)
Big Theta (Θ)
Big Omega (Ω)
Asymptotic Notation
• Upper Bound: Big-Oh

• This notation is known as the upper bound of the algorithm, or the worst case of
an algorithm.

• It tells us that, beyond some input size, the function will never exceed a
constant multiple of the specified bound for any value of input n.

Example: f(n) = 2n + 3
Big Oh (O)
Big-Oh, commonly written as O, is an asymptotic notation for the worst case, or
ceiling of growth, of a given function.

It provides an asymptotic upper bound for the growth rate of the runtime of an
algorithm.

Say f(n) is your algorithm's runtime and g(n) is an arbitrary time complexity you
are trying to relate to your algorithm. f(n) is O(g(n)) if, for some real constants
c (c > 0) and n0, f(n) ≤ c·g(n) for every input size n (n > n0).
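The definition can be sketched in code. This is a finite scan, so it illustrates the witnesses c and n0 rather than proving the bound (a proof must hold for every n > n0); the function name is my own:

```python
# Check the Big-Oh condition f(n) <= c * g(n) for all n in (n0, n_max).
# A finite scan is only illustrative, not a proof.
def is_upper_bound(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max))

# f(n) = 2n + 3 is O(n), witnessed by c = 5 and n0 = 0
assert is_upper_bound(lambda n: 2 * n + 3, lambda n: n, c=5, n0=0)
```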
O-notation
For a function g(n), we define O(g(n)), big-O of g of n, as the set:

O(g(n)) = { f(n) : there exist positive constants c and n0 such that
            0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

g(n) is an asymptotic upper bound for f(n).
EXAMPLE
1. f(n) = 2n + 3
2. f(n) = 3n² − 100n + 6
Big Oh (O)
Example:

f(n) = 2n + 3

We need f(n) ≤ c·g(n) for all values of n ≥ n0.

2n + 3 ≤ 2n + 3n, or 10n, or 7n, or … or n²

2n + 3 ≤ 5n for all n ≥ 1

c = 5, g(n) = n

Therefore f(n) = O(n).

f(n) = O(n²) is also true, but we should state the closest (tightest) upper bound.
f(n) = O(log n) is not true.
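Why O(log n) fails can be seen by searching for a counterexample. A sketch (the constant c = 100 is an arbitrary choice of mine; the same happens for any fixed c):

```python
import math

# For f(n) = 2n + 3, no constant c makes f(n) <= c * log2(n) hold for
# all large n. Scan upward until 2n + 3 first exceeds 100 * log2(n).
f = lambda n: 2 * n + 3
n = 2
while f(n) <= 100 * math.log2(n):
    n += 1
assert f(n) > 100 * math.log2(n)  # a concrete counterexample exists
```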
Big-Ω (Big-Omega) notation

• Sometimes we want to say that an algorithm takes at least a certain amount of
time, without providing an upper bound. For this we use big-Ω notation; Ω is the
Greek letter "omega."

• We say that the running time is "big-Ω of f(n)." We use big-Ω notation for
asymptotic lower bounds, since it bounds the growth of the running time from
below for large enough input sizes.
Big-Ω (Big-Omega) notation
Example:
If f(n) = 2n + 3,
we need f(n) ≥ c·g(n) for all values of n ≥ n0.

2n + 3 ≥ 1·n for all n ≥ 1

c = 1, g(n) = n

Therefore f(n) = Ω(n).

f(n) = Ω(log n) is also true, but we should state the closest (tightest) lower bound.
f(n) = Ω(n²) is not true.
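The mirror-image sketch for Big-Omega (again a finite, illustrative scan; the function name is my own):

```python
# Check the Big-Omega condition f(n) >= c * g(n) for all n in (n0, n_max).
def is_lower_bound(f, g, c, n0, n_max=10_000):
    return all(f(n) >= c * g(n) for n in range(n0 + 1, n_max))

# f(n) = 2n + 3 is Omega(n), witnessed by c = 1 and n0 = 0 ...
assert is_lower_bound(lambda n: 2 * n + 3, lambda n: n, c=1, n0=0)
# ... but Omega(n^2) fails: 2n + 3 drops below n^2 almost immediately.
assert not is_lower_bound(lambda n: 2 * n + 3, lambda n: n * n, c=1, n0=0)
```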
Theta (Θ) notation
Example:
If f(n) = 2n + 3,
we need c1·g(n) ≤ f(n) ≤ c2·g(n) for all values of n ≥ n0.

1·n ≤ 2n + 3 ≤ 5n for all n ≥ 1

c1 = 1, c2 = 5, g(n) = n

Therefore f(n) = Θ(n).

f(n) = Θ(log n) is not true.
f(n) = Θ(n²) is not true.
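Theta demands both bounds simultaneously, which a finite scan can also illustrate (function name my own, not from the slides):

```python
# Check the Theta condition c1 * g(n) <= f(n) <= c2 * g(n) over (n0, n_max).
def is_tight_bound(f, g, c1, c2, n0, n_max=10_000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0 + 1, n_max))

# f(n) = 2n + 3 is Theta(n), witnessed by c1 = 1, c2 = 5, n0 = 0
assert is_tight_bound(lambda n: 2 * n + 3, lambda n: n, c1=1, c2=5, n0=0)
```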
Examples
If f(n) = 2n² + 3n + 4:
f(n) ≤ c2·g(n) for all values of n ≥ n0
2n² + 3n + 4 ≤ 2n² + 3n² + 4n²
2n² + 3n + 4 ≤ 9n² for all n ≥ 1

c = 9, g(n) = n²
Therefore f(n) = O(n²).

n² ≤ 2n² + 3n + 4, so f(n) = Ω(n²).
n² ≤ 2n² + 3n + 4 ≤ 9n², so f(n) = Θ(n²).
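A quick numerical re-check of these witnesses over a finite range (illustrative only):

```python
# For f(n) = 2n^2 + 3n + 4, verify n^2 <= f(n) <= 9n^2 for n >= 1,
# i.e. the witnesses c1 = 1, c2 = 9 for Theta(n^2).
f = lambda n: 2 * n * n + 3 * n + 4
assert all(n * n <= f(n) <= 9 * n * n for n in range(1, 10_000))
```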
