Module 5: Algorithms
Problem Solving
Reference
Introduction to Algorithms by T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein
Graphs
• A graph G = (V, E)
– V = set of vertices
– E = set of edges = subset of V × V
– Thus |E| = O(|V|²)
Graphs
• We will typically express running times in
terms of |E| and |V| (often dropping the |’s)
– If |E| ≈ |V|² the graph is dense
– If |E| ≈ |V| the graph is sparse
• If you know you are dealing with dense or
sparse graphs, different data structures may
make sense
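• For illustration (not from the slides), a minimal Python sketch of the two standard representations: an adjacency matrix (Θ(|V|²) space, natural for dense graphs) and adjacency lists (Θ(|V| + |E|) space, natural for sparse graphs):

# Sketch: two common graph representations (illustrative names; undirected graph,
# vertices numbered 0..n-1, edges given as (u, v) pairs).

def adjacency_matrix(n, edges):
    """Theta(|V|^2) space: a reasonable fit when the graph is dense."""
    M = [[0] * n for _ in range(n)]
    for u, v in edges:
        M[u][v] = M[v][u] = 1
    return M

def adjacency_list(n, edges):
    """Theta(|V| + |E|) space: a reasonable fit when the graph is sparse."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    return adj

if __name__ == "__main__":
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(adjacency_matrix(4, edges))
    print(adjacency_list(4, edges))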
Graph Searching
• Given: a graph G = (V, E), directed or
undirected
• Goal: methodically explore every vertex and
every edge
• Ultimately: build a tree on the graph
– Pick a vertex as the root
– Choose certain edges to produce a tree
– Note: might also build a forest if graph is not
connected
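• As a concrete sketch (not the slides' own code), breadth-first search from a chosen root records a parent pointer the first time it reaches each vertex; those parent pointers are exactly the tree edges, and re-running the search from each still-unvisited vertex yields a forest:

from collections import deque

def bfs_tree(adj, root):
    """Breadth-first search over an adjacency-list graph.
    Returns parent[v] for every vertex reachable from root; the parent
    pointers form the search tree."""
    parent = {root: None}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:      # first time v is reached: (u, v) is a tree edge
                parent[v] = u
                queue.append(v)
    return parent

if __name__ == "__main__":
    adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1], 4: []}   # vertex 4 is disconnected
    print(bfs_tree(adj, 0))   # {0: None, 1: 0, 2: 0, 3: 1}; vertex 4 would start a new tree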
Minimum Spanning Tree
• Problem: given a connected, undirected,
weighted graph, find a spanning tree using
edges that minimize the total weight
[Figure: an example connected, undirected, weighted graph used to illustrate the minimum spanning tree problem]
Prim’s Algorithm
MST-Prim(G, w, r)
    Q = V[G];
    for each u ∈ Q
        key[u] = ∞;
    key[r] = 0;
    p[r] = NULL;
    while (Q not empty)
        u = ExtractMin(Q);
        for each v ∈ Adj[u]
            if (v ∈ Q and w(u,v) < key[v])
                p[v] = u;
                key[v] = w(u,v);
[Figure: MST-Prim running on an example weighted graph]
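• A runnable Python sketch of the same idea (assuming the graph is given as adjacency lists of (neighbor, weight) pairs; a "lazy" heap that skips stale entries stands in for a priority queue with DecreaseKey):

import heapq

def prim_mst(adj, r):
    """Lazy-heap version of MST-Prim. adj[u] is a list of (v, weight) pairs,
    r is the root vertex. Returns the parent (p) pointers as a dict."""
    key = {u: float("inf") for u in adj}   # key[u] = lightest edge seen linking u to the tree
    parent = {u: None for u in adj}
    key[r] = 0
    in_mst = set()
    heap = [(0, r)]                        # (key, vertex); stale entries are skipped below
    while heap:
        k, u = heapq.heappop(heap)
        if u in in_mst:
            continue                       # stale entry: u was already extracted
        in_mst.add(u)
        for v, w in adj[u]:
            if v not in in_mst and w < key[v]:
                key[v] = w                 # "DecreaseKey": push a fresh entry instead
                parent[v] = u
                heapq.heappush(heap, (w, v))
    return parent

if __name__ == "__main__":
    adj = {
        'a': [('b', 4), ('c', 8)],
        'b': [('a', 4), ('c', 2), ('d', 5)],
        'c': [('a', 8), ('b', 2), ('d', 9)],
        'd': [('b', 5), ('c', 9)],
    }
    print(prim_mst(adj, 'a'))   # {'a': None, 'b': 'a', 'c': 'b', 'd': 'b'}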
Single-Source Shortest Path
• Problem: given a weighted directed graph G,
find the minimum-weight path from a given
source vertex s to another vertex v
– “Shortest-path” = minimum weight
– Weight of path is sum of edges
– E.g., a road map: what is the shortest path from
Kolkata to Jaipur?
Shortest Path Properties
• In graphs with negative weight cycles, some
shortest paths will not exist (Why?):
[Figure: a cycle whose edge weights sum to a value < 0]
Relaxation
• A key technique in shortest path algorithms is
relaxation
– Idea: for all v, maintain upper bound d[v] on δ(s,v), the shortest-path weight from s to v
Relax(u,v,w) {
if (d[v] > d[u]+w) then d[v]=d[u]+w;
}
[Figure: two relaxation examples over an edge (u,v) of weight 2: with d[u]=5 and d[v]=9, Relax lowers d[v] to 7; with d[u]=5 and d[v]=6, Relax leaves d[v] unchanged]
Bellman-Ford Algorithm
BellmanFord()
    for each v ∈ V                    // initialize d[], which will converge
        d[v] = ∞;                     //   to the shortest-path values
    d[s] = 0;
    for i = 1 to |V|-1                // relaxation: make |V|-1 passes,
        for each edge (u,v) ∈ E       //   relaxing each edge
            Relax(u, v, w(u,v));
    for each edge (u,v) ∈ E           // test for solution: under what
        if (d[v] > d[u] + w(u,v))     //   condition do we get a solution?
            return "no solution";
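• A direct Python rendering of the pseudocode (a sketch, assuming the graph is given as a list of (u, v, w) edge triples); the final pass reports "no solution" exactly when a negative-weight cycle is reachable from s:

def bellman_ford(vertices, edges, s):
    """vertices: list of vertex names; edges: list of (u, v, w) triples; s: source.
    Returns the dict d of shortest-path weights, or None if a reachable
    negative-weight cycle makes some shortest paths undefined."""
    d = {v: float("inf") for v in vertices}   # d[] will converge to shortest-path values
    d[s] = 0
    for _ in range(len(vertices) - 1):        # make |V|-1 passes,
        for u, v, w in edges:                 #   relaxing every edge in each pass
            if d[u] + w < d[v]:
                d[v] = d[u] + w
    for u, v, w in edges:                     # test: if any edge can still be relaxed,
        if d[u] + w < d[v]:                   #   a negative cycle is reachable from s
            return None                       # "no solution"
    return d

if __name__ == "__main__":
    V = ['s', 'a', 'b', 'c']
    E = [('s', 'a', 6), ('s', 'b', 7), ('a', 'c', 5), ('b', 'c', -3), ('a', 'b', 8)]
    print(bellman_ford(V, E, 's'))   # {'s': 0, 'a': 6, 'b': 7, 'c': 4}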
[Figure: a hash function h maps keys k1 … k5 from the universe of keys U (with K the set of actual keys) into table slots 0 … m-1; k2 and k5 collide because h(k2) = h(k5)]
Resolving Collisions
• How can we solve the problem of collisions?
• Solution 1: chaining
• Solution 2: open addressing
Open Addressing
• Basic idea:
– To insert: if slot is full, try another slot, …, until an
open slot is found (probing)
– To search, follow same sequence of probes as would
be used when inserting the element
• If reach element with correct key, return it
• If reach a NULL pointer, element is not in table
• Good for fixed sets (adding but no deletion)
– Example: spell checking
• Table needn’t be much bigger than n
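• A minimal linear-probing sketch (illustrative; the slides do not fix a particular probe sequence), with search following the same probe order as insert and no deletion supported:

class LinearProbingTable:
    """Open addressing with linear probing; insert and search follow the same
    probe sequence. No deletion, matching the 'fixed set' use case."""
    def __init__(self, m):
        self.m = m
        self.slots = [None] * m            # None marks an empty slot

    def insert(self, key, value):
        for i in range(self.m):
            j = (hash(key) + i) % self.m   # probe sequence: h(k), h(k)+1, h(k)+2, ...
            if self.slots[j] is None or self.slots[j][0] == key:
                self.slots[j] = (key, value)
                return
        raise RuntimeError("hash table overflow")

    def search(self, key):
        for i in range(self.m):
            j = (hash(key) + i) % self.m
            if self.slots[j] is None:      # hit an empty slot: key is not in the table
                return None
            if self.slots[j][0] == key:
                return self.slots[j][1]
        return None

if __name__ == "__main__":
    t = LinearProbingTable(11)
    t.insert("color", 1)
    t.insert("colour", 2)
    print(t.search("colour"), t.search("misspelt"))   # 2 None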
Chaining
• Chaining puts elements that hash to the same
slot in a linked list:
[Figure: hash table T in which each slot points to a linked list (chain) of the keys k1 … k8 from K that hash to that slot; slots with no keys hold NULL]
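• A minimal chained hash table sketch (Python lists stand in for the per-slot linked lists in the figure):

class ChainedHashTable:
    """Chaining: all keys that hash to the same slot go into that slot's chain."""
    def __init__(self, m):
        self.m = m
        self.table = [[] for _ in range(m)]      # one chain per slot

    def insert(self, key, value):
        chain = self.table[hash(key) % self.m]
        for i, (k, _) in enumerate(chain):
            if k == key:                         # key already present: overwrite
                chain[i] = (key, value)
                return
        chain.append((key, value))

    def search(self, key):
        for k, v in self.table[hash(key) % self.m]:
            if k == key:
                return v
        return None

if __name__ == "__main__":
    t = ChainedHashTable(4)
    for word in ["k1", "k2", "k3", "k4", "k5"]:  # with m=4, some keys must collide
        t.insert(word, len(word))
    print(t.search("k5"), t.search("k9"))        # 2 None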
Design Paradigms
• Greedy
• Dynamic Programming
• Divide and Conquer
Greedy algorithms
• A greedy algorithm always makes the choice that
looks best at the moment
• The hope: a locally optimal choice will lead to a
globally optimal solution
• For some problems, it works
• Greedy algorithms tend to be easier to code
• Many examples in scheduling
• Huffman Coding
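• One standard scheduling example is activity selection: among the activities compatible with what has been chosen so far, greedily take the one that finishes earliest. An illustrative Python sketch (not from the slides):

def select_activities(activities):
    """Greedy activity selection: sort by finish time, then repeatedly take the
    next activity whose start is no earlier than the last chosen finish time.
    activities is a list of (start, finish) pairs."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:          # locally best compatible choice
            chosen.append((start, finish))
            last_finish = finish
    return chosen

if __name__ == "__main__":
    acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
    print(select_activities(acts))        # [(1, 4), (5, 7), (8, 11)]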
Divide and Conquer vs. Dynamic Programming