DAA Lecture # 6

The document discusses the merge sort algorithm. It explains the divide-and-conquer approach of merge sort and describes how it works by recursively dividing an array into halves and then merging the sorted halves. The time complexity of merge sort is analyzed to be O(n log n).
Design & Analysis of Algorithms

Lecture # 6

Divide-and-Conquer
- Divide the problem into a number of sub-problems
  - Similar sub-problems of smaller size
- Conquer the sub-problems
  - Solve the sub-problems recursively
  - If a sub-problem is small enough, solve it in a straightforward manner
- Combine the solutions to the sub-problems
  - Obtain the solution for the original problem


Merge Sort Approach
To sort an array A[p . . r]:
- Divide: divide the n-element sequence to be sorted into two subsequences of n/2 elements each
- Conquer: sort the subsequences recursively using merge sort; when the size of a sequence is 1, there is nothing more to do
- Combine: merge the two sorted subsequences
Merge Sort Approach
Merge sort is based on the divide-and-conquer paradigm.
Divide Step
If a given array A has zero or one element, simply return; it is already sorted. Otherwise, split A[p .. r] into two sub-arrays A[p .. q] and A[q + 1 .. r], each containing about half of the elements of A[p .. r].
Conquer Step
Conquer by recursively sorting the two sub-arrays A[p .. q] and A[q + 1 .. r].
Merge Sort Approach
Combine Step
Combine the elements back in A[p .. r] by merging the two sorted sub-arrays A[p .. q] and A[q + 1 .. r] into a sorted sequence. To accomplish this step, we will define a procedure MERGE(A, p, q, r).
Note:
The recursion bottoms out when the sub-array has just one element, so that it is trivially sorted.
Merge Sort Example

Divide phase: split the array in half repeatedly until each piece holds a single element.

[99 6 86 15 58 35 86 4 0]
[99 6 86 15]  [58 35 86 4 0]
[99 6] [86 15]  [58 35] [86 4 0]
[99] [6]  [86] [15]  [58] [35]  [86] [4 0]
[99] [6]  [86] [15]  [58] [35]  [86] [4] [0]

Merge phase: repeatedly merge adjacent sorted pieces.

Merge [4] and [0]                      gives [0 4]
Merge [86] and [0 4]                   gives [0 4 86]
Merge [99] [6], [86] [15], [58] [35]   gives [6 99], [15 86], [35 58]
Merge [6 99] and [15 86]               gives [6 15 86 99]
Merge [35 58] and [0 4 86]             gives [0 4 35 58 86]
Merge [6 15 86 99] and [0 4 35 58 86]  gives [0 4 6 15 35 58 86 86 99]

Final sorted array: 0 4 6 15 35 58 86 86 99
Merge Sort Algorithm
Given a list L of length k:

If k == 1, the list is sorted.

Else:
Merge sort the left half (the first k/2 elements)
Merge sort the right half (the remaining elements)
Merge the two sorted halves
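These steps can be sketched in Python (an illustrative translation, not from the slides; the `merge` helper name is mine):

```python
def merge_sort(lst):
    """Sort a list by recursively sorting each half and merging."""
    k = len(lst)
    if k <= 1:                          # a list of length 0 or 1 is sorted
        return lst
    left = merge_sort(lst[:k // 2])     # sort the left half
    right = merge_sort(lst[k // 2:])    # sort the right half
    return merge(left, right)           # merge the two sorted halves

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])             # at most one of these tails
    result.extend(right[j:])            # is non-empty
    return result
```

For example, merge_sort([99, 6, 86, 15, 58, 35, 86, 4, 0]) returns [0, 4, 6, 15, 35, 58, 86, 86, 99].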
Merge Sort

Alg.: MERGE-SORT(A, p, r)
    if p < r                            ▷ check for base case
        then q ← ⌊(p + r)/2⌋            ▷ divide
             MERGE-SORT(A, p, q)        ▷ conquer
             MERGE-SORT(A, q + 1, r)    ▷ conquer
             MERGE(A, p, q, r)          ▷ combine

Initial call: MERGE-SORT(A, 1, n)

Example array (indices p = 1 .. r = 8): A = 5 2 4 7 1 3 2 6
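A direct Python translation of this pseudocode (a sketch using 0-based indices, with r inclusive; the two-pointer `merge` helper here is my stand-in for the MERGE procedure the slides define separately):

```python
def merge_sort(A, p, r):
    """MERGE-SORT(A, p, r): sort A[p..r] in place (0-based, r inclusive)."""
    if p < r:                        # check for base case
        q = (p + r) // 2             # divide
        merge_sort(A, p, q)          # conquer the left half
        merge_sort(A, q + 1, r)      # conquer the right half
        merge(A, p, q, r)            # combine

def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] back into A."""
    left, right = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # take from left when right is exhausted or left's head is smaller
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]; i += 1
        else:
            A[k] = right[j]; j += 1
```

Usage, mirroring the initial call above (0-based, so the whole array is A[0..n-1]):

```python
A = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(A, 0, len(A) - 1)   # A becomes [1, 2, 2, 3, 4, 5, 6, 7]
```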


Merge - Pseudocode

Alg.: MERGE(A, p, q, r)
1. Compute n1 ← q − p + 1 and n2 ← r − q
2. Copy the first n1 elements into L[1 . . n1] and the next n2 elements into R[1 . . n2]
3. L[n1 + 1] ← ∞; R[n2 + 1] ← ∞
4. i ← 1; j ← 1
5. for k ← p to r
6.     do if L[i] ≤ R[j]
7.         then A[k] ← L[i]
8.              i ← i + 1
9.         else A[k] ← R[j]
10.             j ← j + 1

Example (indices p = 1 .. r = 8, q = 4): A = 2 4 5 7 1 2 3 6, so L = 2 4 5 7 ∞ and R = 1 2 3 6 ∞
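In Python, the ∞ sentinels can be modeled with `math.inf` (a 0-based sketch of the procedure above):

```python
import math

def merge_with_sentinels(A, p, q, r):
    """MERGE(A, p, q, r) with infinity sentinels: merge the sorted runs
    A[p..q] and A[q+1..r] in place (0-based, r inclusive)."""
    L = A[p:q + 1] + [math.inf]      # sentinel at the end of L
    R = A[q + 1:r + 1] + [math.inf]  # sentinel at the end of R
    i = j = 0
    for k in range(p, r + 1):
        if L[i] <= R[j]:             # a sentinel is never copied into A,
            A[k] = L[i]              # since only r - p + 1 slots are filled
            i += 1
        else:
            A[k] = R[j]
            j += 1
```

On the example above: A = [2, 4, 5, 7, 1, 2, 3, 6] with merge_with_sentinels(A, 0, 3, 7) becomes [1, 2, 2, 3, 4, 5, 6, 7].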
Merge - Pseudocode (without sentinels)
1. Divide array AB into two sub-arrays A and B
2. Sort A and B
3. Set a = 1, b = 1, c = 1 to index the first element of A, B and AB respectively
4. Let r be the size of A and s the size of B

5. While (a ≤ r AND b ≤ s)
       If (A[a] ≤ B[b])
           AB[c] = A[a]
           c = c + 1
           a = a + 1
       Else
           AB[c] = B[b]
           c = c + 1
           b = b + 1
   End While
6. Copy the remaining elements of A or B:
   If (a > r) then
       For i = 0 to s − b
           AB[c + i] = B[b + i]
   Else
       For i = 0 to r − a
           AB[c + i] = A[a + i]
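This sentinel-free merge might look like this in Python (a sketch; 0-based lists instead of the 1-based arrays in the pseudocode, keeping the A, B, AB naming):

```python
def merge_guarded(A, B):
    """Merge sorted lists A and B into a new list AB without sentinels:
    walk both lists, then copy whichever tail remains."""
    AB = []
    a = b = 0
    while a < len(A) and b < len(B):
        if A[a] <= B[b]:             # <= keeps the merge stable
            AB.append(A[a]); a += 1
        else:
            AB.append(B[b]); b += 1
    AB.extend(A[a:])                 # at most one of these tails
    AB.extend(B[b:])                 # is non-empty
    return AB
```

Note the comparison uses ≤ so that, on ties, the element from A is taken first, which keeps merge sort stable.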
Analyzing Merge Sort

Merge sort always splits the input in half and always merges all n elements, so its running time does not depend on the input order:
Best case = worst case = average case = O(n log n)


Analyzing Merge Sort (Recursion Tree)
Conclusions
- Θ(n lg n) grows more slowly than Θ(n²).
- Therefore, merge sort asymptotically beats insertion sort in the worst case.
- In practice, merge sort beats insertion sort for n > 30 or so.
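The n ≈ 30 crossover is machine- and implementation-dependent; a rough timing sketch (the helper names and the sizes tested are my own choices) to probe it:

```python
import random
import timeit

def insertion_sort(a):
    """Textbook insertion sort (in place); returns a for convenience."""
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]          # shift larger elements right
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    """Textbook top-down merge sort; returns a new sorted list."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

# Time both sorts on random inputs of growing size.
for n in (10, 100, 1000):
    data = [random.randrange(n) for _ in range(n)]
    t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=10)
    t_mrg = timeit.timeit(lambda: merge_sort(data[:]), number=10)
    print(f"n={n:5d}  insertion={t_ins:.5f}s  merge={t_mrg:.5f}s")
```

For small n, insertion sort's low constant factors usually win; as n grows, merge sort's Θ(n log n) behavior takes over.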
Example
Given array: 2 5 8 4 7 6 1 9 3

First sub-array: 2 5 8 4    Second sub-array: 7 6 1 9 3

After recursively sorting each sub-array:
First sorted sub-array: 2 4 5 8    Second sorted sub-array: 1 3 6 7 9

Merging them yields the final sorted array: 1 2 3 4 5 6 7 8 9
Analysis

Proof Using Telescoping Method

Using Recursion Tree Method
Note Regarding Recursion Tree
Remember that the running times above use logarithms of base 2.

log2(8) = 3 means three levels of the recursion tree
log2(16) = 4 means four levels of the recursion tree
log2(32) = 5 means five levels of the recursion tree
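This level count can be checked with a short recursive sketch (my own helper, not from the slides):

```python
import math

def split_levels(n):
    """Depth of merge sort's recursion tree: how many times a list of
    size n must be halved before every piece has a single element."""
    if n <= 1:
        return 0
    # the larger half, ceil(n/2), determines the depth
    return 1 + split_levels(math.ceil(n / 2))

print(split_levels(8), split_levels(16), split_levels(32))   # 3 4 5
```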
Merge Sort Analysis Explanation
Assumption: N is a power of two.
For N = 1: the time is a constant (denoted by 1).
Otherwise: the time to merge sort N elements = twice the time to merge sort N/2 elements, plus the time to merge two arrays of N/2 elements each.
The time to merge two arrays of N/2 elements each is linear, i.e. N.
Thus we have:
(1) T(1) = 1
(2) T(N) = 2T(N/2) + N
Merge Sort Analysis Explanation
We now solve this recurrence relation. First, we divide (2) by N:
(3) T(N) / N = T(N/2) / (N/2) + 1
N is a power of two, so we can write
(4) T(N/2) / (N/2) = T(N/4) / (N/4) +1
(5) T(N/4) / (N/4) = T(N/8) / (N/8) +1
(6) T(N/8) / (N/8) = T(N/16) / (N/16) +1
(7) ……
(8) T(2) / 2 = T(1) / 1 + 1
Now we add equations (3) through (8) : the sum of their left-
hand sides will be equal to the sum of their right-hand sides:
Merge Sort Analysis Explanation
T(N) / N + T(N/2) / (N/2) + T(N/4) / (N/4) + … + T(2)/2 =
T(N/2) / (N/2) + T(N/4) / (N/4) + ….+ T(2) / 2 + T(1) / 1 + LogN
(LogN is the sum of 1s in the right-hand sides)

After cancelling the terms that appear on both sides, we get

(9) T(N)/N = T(1)/1 + log N

Since T(1) = 1, we obtain

(10) T(N) = N log N + N = O(N log N)

Hence the complexity of the MergeSort algorithm is O(NlogN).

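As a sanity check, the recurrence can be verified numerically (with T(1) = 1, the exact solution for powers of two is T(N) = N·log2(N) + N, which is still O(N log N)):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The merge sort recurrence: T(1) = 1, T(N) = 2*T(N/2) + N."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For N = 2**e, the closed form is T(N) = N*e + N; check e = 1..10.
for e in range(1, 11):
    n = 2 ** e
    assert T(n) == n * e + n, (n, T(n))
print("recurrence matches N*log2(N) + N for N up to", 2 ** 10)
```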
