Algorithm Short Notes
GATE फर्रे
Module 1: Analysis of Algorithms

Aim : The goal of analysis of algorithms is to compare algorithms, mainly in terms of running time, but also in terms of other factors like memory and developer effort.

Need for analysis (why to analyze || what to analyze || how to analyze)
1. To determine resource consumption <resources such as space + time + cost + registers>. Resources may differ from domain to domain.
2. Performance comparison, to find out the efficient solution.

Methodology of algorithm analysis depends on :
● Language
● Operating system
● Hardware (CPU, processor, memory, input/output)

Types of analysis
1. Aposteriori analysis (platform dependent) : It gives exact values in real units.
2. Apriori analysis (platform independent) : It allows us to compare the relative performance of two algorithms in a platform-independent way. It will not give real values in units.

Asymptotic Notations

θ-Notation [Pronounced "theta"]
Let f(n) and g(n) be two positive functions.
f(n) = θ(g(n)) if and only if
c1 · g(n) ≤ f(n) ≤ c2 · g(n), ∀ n ≥ n0,
such that there exist three positive constants c1 > 0, c2 > 0 and n0 ≥ 1.

O-Notation [Pronounced "big-oh"]
Let f(n) and g(n) be two positive functions.
f(n) = O(g(n)) if and only if
f(n) ≤ c · g(n), ∀ n ≥ n0,
such that there exist two positive constants c > 0, n0 ≥ 1.

Ω-Notation [Pronounced "big-omega"]
Ω-notation provides an asymptotic lower bound for a given function g(n), denoted by Ω(g(n)).
f(n) = Ω(g(n)) if and only if f(n) ≥ c · g(n), ∀ n ≥ n0, such that there exist two positive constants c > 0, n0 ≥ 1.

Analogy between real numbers & asymptotic notation :
Let a, b be two real numbers and f, g two positive functions.
● If f(n) is O(g(n)) : a ≤ b
● If f(n) is Ω(g(n)) : a ≥ b
● If f(n) is Θ(g(n)) : a = b
● If f(n) is o(g(n)) : a < b
● If f(n) is ω(g(n)) : a > b

Rate of growth of functions :
[Highest] → n! → 4^n → 2^n → n² → n log n → log(n!) → n → 2^(log n) → log² n → log log n → 1 [Lowest]

Trichotomy property :
For any two real numbers (a, b) there must be a relation between them : (a > b, a < b, or a = b).
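The growth chain above can be spot-checked numerically. A minimal sketch, evaluating each function at the sample point n = 20 with logs base 2 (the sample point is an illustrative choice; 2^(log n) is skipped since it equals n exactly):

```python
import math

n = 20
lg = lambda x: math.log2(x)

growth = [
    ("n!",        math.factorial(n)),
    ("4^n",       4 ** n),
    ("2^n",       2 ** n),
    ("n^2",       n ** 2),
    ("n log n",   n * lg(n)),
    ("log(n!)",   lg(math.factorial(n))),
    ("n",         n),
    ("log^2 n",   lg(n) ** 2),
    ("log log n", lg(lg(n))),
    ("1",         1),
]

values = [v for _, v in growth]
# Each entry should dominate the next one at this sample point.
assert values == sorted(values, reverse=True)
for name, v in growth:
    print(f"{name:10s} -> {v:,.2f}")
```

Note this only checks one sample point; asymptotic dominance is about behaviour as n → ∞.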
Master theorem for divide and conquer recurrences :
Let f(n) be a positive function and T(n) be defined by the recurrence relation :
T(n) = aT(n/b) + f(n)
where a ≥ 1 and b > 1 are two positive constants.
Case 1 : If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = θ(n^(log_b a)).
Case 2 : If f(n) = θ(n^(log_b a)), then T(n) = θ(n^(log_b a) · log n).
Case 3 : If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = θ(f(n)).

Master theorem for subtract and conquer recurrences :
Let T(n) be a function defined on positive n :
T(n) = aT(n − b) + f(n), if n > 1
T(n) = C, if n ≤ 1
for some constants C, a > 0, b > 0, and f(n) = O(n^d) :
1. T(n) = O(n^d), if a < 1
2. T(n) = O(n^(d+1)), if a = 1
3. T(n) = O(n^d · a^(n/b)), if a > 1

Common recurrence relations and their time complexities :

Recurrence relation                                  Time complexity
T(n) = 2T(√n) + C, n > 2 ; T(n) = C, n = 2           O(log n)
T(n) = T(n − 1) + C, n > 2 ; T(n) = C, n = 2         O(n)
T(n) = T(n − 1) + n + C, n > 2 ; T(n) = C, n = 1     O(n²)
T(n) = T(n − 1) · n + C, n > 2 ; T(n) = C, n = 1     O(n^n)
T(n) = 2T(n/2) + C, n > 1 ; T(n) = C, n = 1          O(n)
T(n) = 2T(n/2) + n, n > 2 ; T(n) = C, n = 1          O(n log n)
T(n) = T(n/2) + C, n > 1 ; T(n) = C, n = 1           O(log n)
T(n) = T(√n) + C, n > 2 ; T(n) = 1, n = 2            Θ(log log n)
T(n) = T(n/2) + 2^n, n > 1                           O(2^n)

Analogy between real numbers & asymptotic notation :
Let a, b be two real numbers and f, g two positive functions.
● If f(n) is O(g(n)) : a ≤ b (f grows no faster than some constant multiple of g)
● If f(n) is Ω(g(n)) : a ≥ b (f grows at least as fast as some constant multiple of g)
● If f(n) is Θ(g(n)) : a = b (f grows at the same rate as g)
● If f(n) is o(g(n)) : a < b (f grows strictly slower than every constant multiple of g)
● If f(n) is ω(g(n)) : a > b (f grows strictly faster than every constant multiple of g)
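The closed forms in the recurrence table can be verified exactly for powers of two. A minimal sketch, assuming base case T(1) = 1 for both recurrences:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t_const(n):              # T(n) = 2T(n/2) + 1, T(1) = 1  ->  O(n)
    return 1 if n == 1 else 2 * t_const(n // 2) + 1

@lru_cache(maxsize=None)
def t_linear(n):             # T(n) = 2T(n/2) + n, T(1) = 1  ->  O(n log n)
    return 1 if n == 1 else 2 * t_linear(n // 2) + n

n = 2 ** 16
# T(n) = 2T(n/2) + 1 solves to exactly 2n - 1, i.e. Theta(n).
assert t_const(n) == 2 * n - 1
# T(n) = 2T(n/2) + n solves to exactly n*log2(n) + n, i.e. Theta(n log n).
assert t_linear(n) == n * int(math.log2(n)) + n
```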
Analysis
1. f(n) = n!
n! ≤ c · n^n : n ≥ 2
⇒ n! = O(n^n) with c = 1, n0 = 2.
Using Stirling's approximation : n! ≈ √(2πn) · n^n · e^(−n)
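Stirling's approximation can be sanity-checked in a few lines; the relative error shrinks roughly like 1/(12n):

```python
import math

# Compares n! with Stirling's approximation sqrt(2*pi*n) * n^n * e^(-n).
def stirling(n: int) -> float:
    return math.sqrt(2 * math.pi * n) * n ** n * math.exp(-n)

for n in (5, 10, 15):
    exact = math.factorial(n)
    approx = stirling(n)
    rel_err = abs(exact - approx) / exact
    print(n, rel_err)
    # The relative error is about 1/(12n), so it is well under 1/(10n).
    assert rel_err < 1 / (10 * n)
```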
Properties of asymptotic notations :

             O    Ω    Θ    o    ω
Reflexive    ✓    ✓    ✓    ×    ×
Symmetric    ×    ×    ✓    ×    ×
Transitive   ✓    ✓    ✓    ✓    ✓

3. Transitive
f(n) = Θ(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
Note : Ω and Θ also satisfy transitivity.
I. Symmetric form
T(n) = aT(n/b) + g(n)
a : number of sub-problems
b : size shrink factor (each sub-problem has size n/b)
g(n) : cost to divide and combine
e.g. Merge sort : T(n) = 2T(n/2) + O(n)

II. Asymmetric form
T(n) = T(αn) + T((1 − α)n) + g(n), provided that 0 < α < 1
e.g. T(n) = T(n/3) + T(2n/3) + g(n)
Algorithm DAC(A, 1, n)
  if (Small(1, n))
    return S(A, 1, n)
  else
    m ← Divide(1, n)
    S1 ← DAC(A, 1, m)
    S2 ← DAC(A, m+1, n)
    return Combine(S1, S2)
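As a concrete instance of the DAC template above, here is merge sort, with the roles of Small, Divide and Combine marked in comments (the mapping to the pseudocode names is illustrative):

```python
def merge_sort(a):
    if len(a) <= 1:                     # Small(l, n): trivially solved
        return a
    m = len(a) // 2                     # Divide(l, n): pick the midpoint
    s1 = merge_sort(a[:m])              # DAC on left half
    s2 = merge_sort(a[m:])              # DAC on right half
    return merge(s1, s2)                # Combine(S1, S2)

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= keeps the sort stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```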
1. Finding minimum and maximum
Note : Provided that the list of elements is already sorted.
T(n) = c : n = 1
T(n) = a + T(n/2) : n > 1
Time complexity : O(log n)
Space complexity : O(1)

4. Merge Sort Algorithm
● Comparison-based sorting
● Stable sorting, but outplace
Recurrence relation : T(n) = c, if n = 1
T(n) = T(n/2) + T(n/2) + cn, if n > 1
Time complexity = O(n log n) = Ω(n log n) = Θ(n log n)
Space complexity : O(n + log n) = O(n)

5. Quick Sort Algorithm
Best case / Average case :
T(n) = 1, if n = 1
T(n) = 2T(n/2) + n + C, if n > 1
Time complexity : O(n log n)
Worst case : T(n) = n + T(n − 1) + C, if n > 1
Time complexity : O(n²)
Note : Quick sort hits its worst case when the elements are already sorted.

I. Using DAC : T(n) = 8T(n/2) + O(n²), for n > 1

8. Counting Number of Inversions (An inversion in an array is a pair (i, j) such that i < j and arr[i] > arr[j])
Time complexity : O(n log n)
Space complexity : O(n) (due to merges)

9. Closest Pair of Points (Find the minimum Euclidean distance between any two points in a 2D plane.)
Recurrence relation : T(n) = 2T(n/2) + O(n)
Time complexity = O(n log n)
Sorted copies (x-sorted, y-sorted) : O(n)
Auxiliary space (recursion stack) : O(log n)

10. Convex Hull (Find the smallest convex polygon that encloses all the points in a 2D plane.)
T(n) = 2T(n/2) + O(n)
Time complexity = O(n log n)
Space complexity : O(log n)

Note : In the GATE exam, if merge sort is given, always consider it outplace.
● If the array size is very large, merge sort is preferable.
● If the array size is very small, prefer insertion sort.
● Merge sort is a stable sorting technique.
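The O(n log n) bound for counting inversions comes from piggybacking the count on merge sort: whenever an element from the right half is placed before remaining left-half elements, each of those remaining elements forms an inversion with it. A minimal sketch:

```python
def count_inversions(a):
    if len(a) <= 1:
        return a, 0
    m = len(a) // 2
    left, inv_l = count_inversions(a[:m])
    right, inv_r = count_inversions(a[m:])
    merged, i, j, inv = [], 0, 0, inv_l + inv_r
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
            inv += len(left) - i        # all remaining left elements are bigger
    merged += left[i:] + right[j:]
    return merged, inv

_, inv = count_inversions([2, 4, 1, 3, 5])
print(inv)                              # 3 : pairs (2,1), (4,1), (4,3)
```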
Karatsuba optimization :
T(n) = 3T(n/2) + bn ; if n > 1
Time complexity : O(n^(log₂ 3)) ≈ O(n^1.58)
Karatsuba is better, but still not the fastest known method.
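A sketch of the three-multiplication split that yields T(n) = 3T(n/2) + bn, on non-negative Python ints (splitting at half the digit length):

```python
def karatsuba(x: int, y: int) -> int:
    if x < 10 or y < 10:                 # base case: single digit
        return x * y
    m = max(len(str(x)), len(str(y))) // 2
    p = 10 ** m
    xh, xl = divmod(x, p)                # split x = xh*10^m + xl
    yh, yl = divmod(y, p)
    a = karatsuba(xh, yh)                # high parts
    b = karatsuba(xl, yl)                # low parts
    c = karatsuba(xh + xl, yh + yl) - a - b   # cross terms via one multiply
    return a * p * p + c * p + b

print(karatsuba(1234, 5678) == 1234 * 5678)   # True
```

The saving is the single recursive call for the cross terms, instead of the two multiplies (xh·yl and xl·yh) the schoolbook split would need.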
3. Fractional Knapsack
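The worked details of this section are on another page; as a sketch, the standard greedy strategy sorts items by value/weight ratio and takes a fraction of the first item that does not fully fit (the item data below is made up for illustration):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight); returns the maximum total value."""
    # Greedy choice: best value-per-weight first -> O(n log n) for the sort.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)     # whole item, or a fraction of it
        total += value * (take / weight)
        capacity -= take
    return total

# Classic example: capacity 50 with items (60,10), (100,20), (120,30).
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))
```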
● For each activity, check if its start time is greater than or equal to the finish time of the last selected activity.
● If the condition holds, select the activity and update the last selected finish time.

Time complexity
If activities are not sorted by finish time :
● Sorting takes O(n log n)
● Selecting activities takes O(n)
● Total time = O(n log n)
If activities are sorted by finish time :
● Only the selection loop runs → O(n)
● Total time = O(n)

5. Minimum cost spanning tree

I. Kruskal's minimum spanning tree algorithm
It builds the Minimum Spanning Tree by always choosing the next lightest edge that doesn't form a cycle.
● Sort all edges of the graph in non-decreasing order of their weights.
● Initialize an empty set for the MST.
● For each edge in the sorted list : if the edge does not form a cycle with the MST formed so far, include it in the MST; otherwise discard the edge.
● Repeat until the MST includes V − 1 edges (where V is the number of vertices).
Time complexity : O(E log E) or O(E log V)
Note : Works well with sparse graphs (fewer edges). May produce a forest if the graph is not connected.

II. Prim's minimum spanning tree algorithm
It builds the MST by growing it one vertex at a time, always choosing the minimum-weight edge that connects a vertex inside the MST to one outside.
● Start with a random vertex; initialize an MST set (vertices included in the MST) and a priority queue (or min-heap) of edge weights.
● While the MST set does not include all vertices : select the minimum-weight edge that connects a vertex in the MST to a vertex outside, then add the selected edge and vertex to the MST.
● More efficient for dense graphs.
Time complexity :
Adjacency matrix + linear search = O(V²)
Adjacency list + binary heap = O(E log V)
Adjacency list + Fibonacci heap = O(E + V log V)

6. Single source shortest path algorithms

I. Dijkstra's algorithm
Using min-heap & adjacency list = O((E + V) log V)
Using min-heap & adjacency matrix = O(V² log V)
Using unsorted array & adjacency list = O(V²)
Using sorted doubly linked list & adjacency list = O(EV)

II. Bellman–Ford algorithm
It finds the shortest path from the source to every vertex, provided the graph does not contain a negative-weight cycle. If the graph contains a negative-weight cycle, it does not compute the shortest paths from the source to all other vertices, but reports that a negative weight cycle exists.
Input : A weighted, directed graph G = (V, E), with edge weights w(u, v), and a source vertex s.
Output : Shortest path distances from source s to all other vertices, or detection of a negative-weight cycle.
Time complexity : O(EV)
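Dijkstra with a min-heap and adjacency list can be sketched with lazy deletion of stale heap entries (the sample graph is made up for illustration; edge weights must be non-negative):

```python
import heapq

def dijkstra(graph, source):
    dist = {source: 0}
    heap = [(0, source)]                      # (distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale heap entry, skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                  # relax edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist

g = {"s": [("a", 1), ("b", 4)], "a": [("b", 2), ("t", 6)], "b": [("t", 3)]}
print(dijkstra(g, "s"))                       # {'s': 0, 'a': 1, 'b': 3, 't': 6}
```

Each edge can push at most one heap entry, giving the O((E + V) log V) bound quoted above.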
Module 5 : Graph Traversal Techniques
Visiting all nodes of the tree/graph in a specified order and processing the information only once.

Topological sort : a linear order of the vertices representing the activities, maintaining precedence.
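Minimal sketches of the two standard traversal orders, BFS and DFS, on an adjacency list (the sample graph is made up for illustration):

```python
from collections import deque

def bfs(graph, start):
    seen, order, q = {start}, [], deque([start])
    while q:
        u = q.popleft()                  # FIFO queue -> level by level
        order.append(u)
        for v in graph.get(u, []):
            if v not in seen:
                seen.add(v)              # mark once, process once
                q.append(v)
    return order

def dfs(graph, u, seen=None):
    seen = set() if seen is None else seen
    seen.add(u)                          # process each vertex only once
    order = [u]
    for v in graph.get(u, []):
        if v not in seen:
            order += dfs(graph, v, seen)
    return order

g = {1: [2, 3], 2: [4], 3: [4], 4: []}
print(bfs(g, 1))                         # [1, 2, 3, 4]
print(dfs(g, 1))                         # [1, 2, 4, 3]
```

Both visit every vertex and edge once, so each runs in O(V + E) with an adjacency list.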
In-place vs Not-in-place : Space required is generally O(1), or at most O(log n) (for the recursion stack). Merge sort → O(n) space.

Stable vs Unstable : Relative order of equal elements is maintained (stable).

Sorting algorithm   Best         Average      Worst        Stable   In-place
Selection sort      Ω(n²)        Θ(n²)        O(n²)        No       Yes
Bubble sort         Ω(n)         Θ(n²)        O(n²)        Yes      Yes
Heap sort           Ω(n log n)   Θ(n log n)   O(n log n)   No       Yes
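The Stable column can be demonstrated directly: bubble sort's adjacent swaps preserve the order of equal keys, while selection sort's long-distance swap does not (the records and labels below are made up for illustration):

```python
def bubble_sort(a):
    a = a[:]
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j][0] > a[j + 1][0]:    # strict > : equal keys never swap
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def selection_sort(a):
    a = a[:]
    for i in range(len(a)):
        m = min(range(i, len(a)), key=lambda k: a[k][0])
        a[i], a[m] = a[m], a[i]          # long-distance swap breaks stability
    return a

data = [(2, "a"), (1, "x"), (2, "b"), (1, "y")]
print(bubble_sort(data))     # [(1, 'x'), (1, 'y'), (2, 'a'), (2, 'b')] - stable
print(selection_sort(data))  # [(1, 'x'), (1, 'y'), (2, 'b'), (2, 'a')] - 2s flipped
```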