
Sorting Algorithms

Contents
• Insertion sort
• Bubble sort
• Selection sort
• Shell sort
• Merge sort
• Quick sort
• Heap sort
• Radix sort
2
Insertion Sort
• If a new element is inserted in the correct place
in a sorted list, the list will remain sorted
• Insertion Sort starts with a single-element list
• Adding a second element in the proper place
keeps the list sorted
• Repeat until all elements are inserted into the
list at their correct place
• The sorted list grows by one element after each
insertion
3
Insertion Sort Example

(Figure: the list is divided into a part that is sorted already and a part not yet processed.)

4
Insertion Sort Algorithm
Begin
  Repeat for i = 2 to N step 1
    Set new := list[ i ]
    Set loc := i - 1
    Repeat while (loc ≥ 1) and (list[loc] > new)
      Set list[loc + 1] := list[loc]   // shift list[loc] one position to the right
      Set loc := loc - 1
    End Repeat
    Set list[ loc + 1 ] := new
  End Repeat
End

5
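As a concrete illustration, here is a minimal Python sketch of the same procedure (0-based indexing instead of the 1-based pseudocode above; the function name is only illustrative):

def insertion_sort(items):
    # Sort the list in place; the prefix items[:i] is always sorted.
    for i in range(1, len(items)):
        new = items[i]
        loc = i - 1
        # Shift larger elements one position to the right.
        while loc >= 0 and items[loc] > new:
            items[loc + 1] = items[loc]
            loc -= 1
        items[loc + 1] = new
    return items

# Example: insertion_sort([5, 2, 4, 6, 1, 3]) returns [1, 2, 3, 4, 5, 6]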
Analysis of Insertion Sort
• In Place sorting – no additional space required for
sorting
• Best Case – when the list is already sorted, only N − 1
comparisons are required; O(N)
• Worst Case – when the list is sorted in reverse order,
about N²/2 comparisons and a similar number of
shifts are required; O(N²)
• Average Case – same order as the Worst Case; O(N²)
• Only in-place sorting method suitable for Linked Lists

6
Bubble Sort
• If pairs of adjacent elements in a list are
compared with each other and none are out of
order, the list is sorted
• If any pair is out of order, those elements can be
swapped to get an ordered list
• Bubble sort makes multiple passes through a list
swapping any adjacent elements that are out of
order

7
Bubble Sort
• After the first pass, the largest element is
positioned at the end of list
• After the second pass, the second largest
element is positioned before the largest element
• Because of this, each successive pass of the
comparison loop can be shortened by one

8
Bubble Sort Example

9
Bubble Sort Algorithm
Begin
  Set numberOfPairs := N
  Set swappedElements := true
  Repeat while (swappedElements is true)
    Set numberOfPairs := numberOfPairs - 1
    Set swappedElements := false
    Repeat for i = 1 to numberOfPairs step 1
      If (list[ i ] > list[ i + 1 ]) then
        Swap( list[i], list[i + 1] )
        Set swappedElements := true
      End if
    End Repeat
  End Repeat
End
10
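A minimal Python sketch of the algorithm above, including the early-exit check (names are illustrative):

def bubble_sort(items):
    # Sort in place; stop as soon as a full pass makes no swaps.
    number_of_pairs = len(items)
    swapped = True
    while swapped:
        number_of_pairs -= 1          # the largest elements settle at the end
        swapped = False
        for i in range(number_of_pairs):
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items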
Improvements over Original Bubble Sort
• Reducing the number of elements covered by the
inner loop by 1 on each pass – the largest elements
have already moved to the end
• Checking whether any elements have been
swapped in each cycle – if not, the list is already
sorted and no further loops are necessary

11
Bubble Sort Analysis
• In Place Sort
• Time performance very similar to Insertion
Sort
• Average Case and Worst Case: O(N²)
• Best Case: O(N)

12
Selection Sort
• If we can repeatedly identify the smallest remaining
element and move it to the front of the unsorted part
of the list, the list becomes sorted
• Selection sort passes through the list comparing
elements so that each element reaches its final
position in a single exchange
• This sort is similar to Bubble Sort but performs about
the same number of comparisons and far fewer exchanges

13
Selection Sort Algorithm
Begin
  Repeat for i = 1 to N step 1
    Set current := i
    Repeat for location = i to N step 1
      If list[location] < list[current]   // min element identified
        Set current := location
      End if
    End Repeat
    If ( i <> current )                   // swap only if required
      Swap( list[i], list[current] )
    End if
  End Repeat
End
14
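A minimal Python sketch of selection sort following the pseudocode above (0-based indexing; names illustrative):

def selection_sort(items):
    # On pass i, find the smallest remaining element and swap it
    # into position i (at most one exchange per pass).
    n = len(items)
    for i in range(n - 1):
        current = i
        for location in range(i + 1, n):
            if items[location] < items[current]:
                current = location          # new minimum found
        if current != i:                    # swap only if required
            items[i], items[current] = items[current], items[i]
    return items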
Selection Sort Analysis
• Worst Case Performance in the same range as
Bubble Sort – O(N²)
• Performs N exchanges and N² comparisons as
against N² exchanges and N² comparisons of
Bubble Sort
• Used in situations when exchanging elements is
more expensive than comparing them, for
example, lists with large records

15
Shell Sort
• Shell Sort treats the list as a combination of
interleaved sublists
• For example, elements in the even locations
could be one list and elements in the odd
locations the other list
• Shell sort begins by sorting many small lists, and
increases their size and decreases their number
as it continues

16
Shell Sort
• One technique uses lists with decreasing powers
of 2; for example, if the list has 64 elements, the
first pass uses 32 sublists of 2 elements each, the
second pass uses 16 sublists of 4 elements each,
and so on
• Other increment sequences have also been used –
no single sequence works best for all lists
• These sublists are sorted using Insertion Sort

17
Shell Sort Example
Pass 1: 8 sublists, 2 elements / sublist, increment = 8
Pass 2: 4 sublists, 4 elements / sublist, increment = 4
Pass 3: 2 sublists, 8 elements / sublist, increment = 2
Pass 4: 1 sublist, 16 elements / sublist, increment = 1

18
Shell Sort Algorithm
Begin
  Set passes := ⌊log2 N⌋
  Repeat while (passes ≥ 1)
    Set increment := 2^passes - 1
    Repeat for start = 1 to increment step 1
      InsertionSort(list, N, start, increment)
    End Repeat
    Set passes := passes - 1
  End Repeat
End

19
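A minimal Python sketch of Shell sort using the increments 2^k − 1 discussed above; the helper below is an assumption for illustration, and any decreasing gap sequence that ends in 1 would also work:

def shell_sort(items):
    # Gapped insertion sort with increments 2**k - 1 (..., 15, 7, 3, 1).
    n = len(items)
    passes = n.bit_length() - 1          # floor(log2 n) for n >= 1
    while passes >= 1:
        gap = 2 ** passes - 1
        for i in range(gap, n):          # insertion sort on each gapped sublist
            new = items[i]
            loc = i - gap
            while loc >= 0 and items[loc] > new:
                items[loc + gap] = items[loc]
                loc -= gap
            items[loc + gap] = new
        passes -= 1
    return items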
Shell Sort Analysis
• Performance of the algorithm changes
dramatically with different values of increment
• With increments that are one less than powers of
2, the worst case has been shown to be O(N^(3/2))
• An order of O(N^(4/3)) can be achieved with other
increments
• Still undergoing active research and analysis
• Efficient In-Place sorting method for large lists

20
Merge Sort
• If there are two sorted lists, a combined sorted
list can be created by merging the lists
• Merge Sort starts by breaking an unsorted list
down into single elements
• Single elements are combined to form sorted
pairs – which then are combined to form larger
sorted lists
• It is a Divide and Conquer algorithm – divides a
large list into smaller units and builds the
complete list by combining them
21
Merge Sort Example

22
Merge Sort Example (continued)

23
Merge Sort Example (continued)

24
Merge Sort Algorithm
MergeSort(list, first, last)
Begin
  If first < last then
    Set middle := ( first + last ) / 2
    MergeSort( list, first, middle )                    // L-Part
    MergeSort( list, middle + 1, last )                 // R-Part
    MergeLists( list, first, middle, middle + 1, last )
  End if
End
// list: the list of elements to be put in order
// first: the index of the 1st element in the part of list to sort
// last: the index of the last element in the part of list to sort
// if first=last, list has only one element

25
MergeList Algorithm (Part 1)
MergeLists( list, start1, end1, start2, end2 )
// start1: beginning of list A    end1: end of list A
// start2: beginning of list B    end2: end of list B
// result: list C
Begin
  Set finalStart := start1
  Set finalEnd := end2
  Set indexC := 1
  Repeat while (start1 ≤ end1) and (start2 ≤ end2)
    If (list[start1] < list[start2]) then
      Set result[indexC] := list[start1]
      Set start1 := start1 + 1
    Else
      Set result[indexC] := list[start2]
      Set start2 := start2 + 1
    End if
    Set indexC := indexC + 1
  End Repeat
26
MergeList Algorithm (Part 2)

  If start1 ≤ end1 then                  // leftover in first part
    Repeat for i = start1 to end1 step 1
      Set result[indexC] := list[i]
      Set indexC := indexC + 1
    End Repeat
  Else                                   // leftover in second part
    Repeat for i = start2 to end2 step 1
      Set result[indexC] := list[i]
      Set indexC := indexC + 1
    End Repeat
  End if
  Set indexC := 1                        // put the sorted list back
  Repeat for i = finalStart to finalEnd step 1
    Set list[i] := result[indexC]
    Set indexC := indexC + 1
  End Repeat
End
27
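A minimal Python sketch of MergeSort/MergeLists; it uses a temporary Python list for the merge rather than a preallocated result array, but follows the same structure (names illustrative):

def merge_sort(items, first=0, last=None):
    # Recursively sort items[first:last+1] in place.
    if last is None:
        last = len(items) - 1
    if first < last:
        middle = (first + last) // 2
        merge_sort(items, first, middle)          # L-part
        merge_sort(items, middle + 1, last)       # R-part
        merge_lists(items, first, middle, middle + 1, last)
    return items

def merge_lists(items, start1, end1, start2, end2):
    # Merge two adjacent sorted runs into 'result', then copy back.
    result = []
    final_start, final_end = start1, end2
    while start1 <= end1 and start2 <= end2:
        if items[start1] < items[start2]:
            result.append(items[start1]); start1 += 1
        else:
            result.append(items[start2]); start2 += 1
    result.extend(items[start1:end1 + 1])         # leftover in first part (if any)
    result.extend(items[start2:end2 + 1])         # leftover in second part (if any)
    items[final_start:final_end + 1] = result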
Merge Sort Analysis
• The worst case: O (N log N)
• The best case: O (N log N)
• Performance in the same order as Heap sort
• Requires additional space when used with
arrays (for storing merged list)
• Best algorithm for sorting with Linked Lists
(no additional space required)

28
Quicksort

• Another Divide and Conquer algorithm, Quicksort
uses a partition process to split a list repeatedly
• Initially two lists are made from the given list via
partitioning using a Pivot element
• All elements to the left of Pivot are smaller than
Pivot and all elements to the right are larger
• The process is repeated with smaller lists
• No merging is required (unlike Merge Sort)

29
Quicksort Algorithm

Quicksort(list, first, last)
Begin
  If first < last then
    Set p := PivotList(list, first, last)
    // p: the location of the pivot
    // the pivot is at its final position
    Quicksort(list, first, p - 1)    // L-Part
    Quicksort(list, p + 1, last)     // R-Part
  End if
End

Note:
(first == last) → only one element
(first > last) → no more elements

30
Partitioning Process
• Partitioning is the critical element of Quicksort
algorithm
• During Partitioning, the list is scanned element by
element comparing values to the pivot
• During this process, there are sections of the list as
indicated below
(Figure: the list divided into elements smaller than the pivot, elements larger than the pivot, and a "not yet scanned" section; p marks the end of the smaller-than-pivot section and i the next element to scan.)

31
Partitioning Process

1. 1st element in the sub-list, say A[1], is used as the pivot

2. Two indices i and p are used as

– p (pivotpoint) points to the pivot element

– i (index) points to the next element to be processed

3. List A is scanned from the left starting at i = 2, and every element A[i] is compared with the pivot:

if (A[i] < pivot), increment p and swap (A[i], A[p])

in either case, increment i and continue this process

4. Finally, swap (A[1], A[p]). The pivot in this sub-list is now in its correct position

32
Quicksort Example

See the PivotList Algorithm for details

33
Quicksort Example

34
PivotList Algorithm
PivotList(list, first, last)
Begin
  Set PivotValue := list[ first ]
  Set pivot := first
  Repeat for index = first + 1 to last step 1   // this loop does (last - first) iterations
    If (list[ index ] < PivotValue)
      Set pivot := pivot + 1
      If ( pivot <> index )                     // don't swap an element with itself
        Swap( list[pivot], list[index] )
      End if
    End if
  End Repeat
  Swap( list[first], list[pivot] )              // move pivot value into its correct place
  Return pivot                                  // returns the position of the pivot
End

index : incremented at every step
pivot : incremented only when list[index] < PivotValue

35
PivotList Algorithm (Alternate)
PivotList(list, first, last)
Begin
  Set PivotValue := list[ first ]
  Set lower := first + 1
  Set upper := last
  Repeat while lower <= upper
    Repeat while (lower <= upper) and (list[lower] <= PivotValue)
      Set lower := lower + 1
    End Repeat
    Repeat while (lower <= upper) and (list[upper] >= PivotValue)
      Set upper := upper - 1
    End Repeat
    If lower < upper
      Swap( list[lower], list[upper] )
    End If
  End Repeat
  Swap( list[first], list[upper] )
  Return upper                        // returns the position of the pivot
End
36
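A minimal Python sketch of Quicksort using the first PivotList version above (the single-scan partition with the first element as pivot); the 0-based indexing and names are illustrative:

def quicksort(items, first=0, last=None):
    # Sort items[first:last+1] in place.
    if last is None:
        last = len(items) - 1
    if first < last:
        p = pivot_list(items, first, last)
        quicksort(items, first, p - 1)    # L-part
        quicksort(items, p + 1, last)     # R-part
    return items

def pivot_list(items, first, last):
    pivot_value = items[first]
    pivot = first
    for index in range(first + 1, last + 1):
        if items[index] < pivot_value:
            pivot += 1
            if pivot != index:            # don't swap an element with itself
                items[pivot], items[index] = items[index], items[pivot]
    items[first], items[pivot] = items[pivot], items[first]
    return pivot                          # final position of the pivot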
Quicksort Analysis
• PivotList makes N – 1 comparisons
• In the worst case, it can create one partition that
has N – 1 elements and the other will have no
elements
• Hence, worst case scenario: O(N²)
• Average Case and Best Case: O (N log N)
• Preferred to other sorting methods because it is
substantially faster in typical applications

37
Heap Sort

• A heap is a tree where, for each subtree, the value
stored at the root is larger than all other values
stored in the subtree
• This tree is called a MaxHeap – the root of the tree
is the largest node
• The other type of heap is a MinHeap, where the root
is the smallest of all elements

38
Heap Details
• A heap is also a complete binary tree, so nodes
are filled in along the bottom of the tree from
left to right and a new level is started only when
the previous level has been filled

• There is no ordering between the children of any
node

39
Heap Example

40
Heap Storage
• Heap is stored in an array
• For an element stored at location i (i >0), its
children are stored at locations 2i and 2i+1
• If 2i and 2i+1 are greater than the list size, then
the element at location i is a leaf
• If only 2i+1 is greater than the list size, then the
element at location i has just one child

41
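A small illustrative Python helper (not from the slides) showing this 1-based index arithmetic:

# 1-based heap stored in an array of size n (helper names are assumptions).
def children(i, n):
    # Return the indices of the children of node i that exist.
    return [c for c in (2 * i, 2 * i + 1) if c <= n]

def is_leaf(i, n):
    return 2 * i > n          # both child positions fall outside the list

# Example: in a heap of n = 10 elements, node 5 has only one child, at index 10.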
Heapsort
• Heapsort begins by constructing a heap

• The root (the largest value in the heap) is moved
to the last location of the list and the heap size
is reduced by one

• The heap is ‘fixed’ and the process is repeated

42
Heapsort Algorithm
Step 1: Construct the heap
Step 2: Start the sorting process
for i = 1 to N do
Swap the root with the last location of the current heap
Reduce heap size (N) by 1
Fix the heap with new size
end for

Both steps use FixHeap algorithm:

FixHeap(list, low, key, high) {…} // see next slide


// list: the list / heap being sorted
// low: the index of the root of the heap
// key: the key value that needs to be reinserted in the heap
// high: the upper limit of the heap

43
FixHeap Algorithm
FixHeap( list, low, key, high )
Begin
  Set vacant := low                                    // v
  Repeat while 2*vacant ≤ high
    // determine the index of the larger child (2v or 2v + 1)
    Set largerChild := 2*vacant
    If (largerChild < high) and (list[largerChild + 1] > list[largerChild])
      Set largerChild := largerChild + 1
    End if
    If (key > list[largerChild])
      break                                            // key is at the correct place
    Else
      // move the larger child up and continue the process down
      Set list[ vacant ] := list[ largerChild ]
      Set vacant := largerChild                        // either 2v or 2v + 1
    End if
  End Repeat
  Set list[ vacant ] := key
End
44
Constructing the Heap
Starting from the last internal node (i = N/2)
and ending with the root (i = 1):

for i = N/2 down to 1 do
  FixHeap( list, i, list[i], N )   // i : root, list[i] : key, N : bound
end for
45
Heap Construction Example
(Figure: a 16-element array with indices 1–16; heap construction starts at the last internal node, i = 8, whose only child is at 2i = 16; this first FixHeap call causes no change.)
46
Final Heapsort Algorithm
// Step 1: Constructing the heap from the initial list
Repeat for i = N/2 to 1 step -1
  FixHeap( list, i, list[i], N )
End Repeat

// Step 2: Sorting on the constructed heap
Repeat for i = N to 2 step -1
  Set max := list[1]
  FixHeap( list, 1, list[i], i - 1 )
  Set list[i] := max
End Repeat

47
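A minimal Python sketch of FixHeap and the two-step Heapsort above; a dummy element at index 0 keeps the 1-based child arithmetic (names illustrative):

def fix_heap(items, low, key, high):
    # Re-insert 'key' into the heap rooted at 'low' (1-based indices),
    # considering only positions up to 'high'.
    vacant = low
    while 2 * vacant <= high:
        larger_child = 2 * vacant
        if larger_child < high and items[larger_child + 1] > items[larger_child]:
            larger_child += 1                 # right child is larger
        if key > items[larger_child]:
            break                             # key belongs at 'vacant'
        items[vacant] = items[larger_child]   # move the larger child up
        vacant = larger_child
    items[vacant] = key

def heap_sort(values):
    items = [None] + list(values)             # shift to 1-based indexing
    n = len(values)
    for i in range(n // 2, 0, -1):            # Step 1: construct the heap
        fix_heap(items, i, items[i], n)
    for i in range(n, 1, -1):                 # Step 2: repeatedly extract the max
        maximum = items[1]
        fix_heap(items, 1, items[i], i - 1)
        items[i] = maximum
    return items[1:]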
Final Heapsort Loop - 1

48
Final Heapsort Loop - 2

49
Heapsort Analysis
• FixHeap places each node at appropriate level with
respect to its parent node
• For a binary heap, if N is the number of nodes, the
depth D <= log N
• In the Worst Case, for N nodes, the number of
operations is proportional to N log N, i.e. O(N log N)
• Best Case is also O (N log N)
• Number of operations for building a heap is O(N)
• Best In-Place sorting algorithm – suitable for arrays
but not Linked Lists
50
Radix Sort
• This sort is unusual because it does not directly
compare any of the elements
• Instead, it creates a set of buckets and repeatedly
separates the elements into those buckets
• On each pass, a different part (digit) of each element
is examined and used to distribute the elements

51
Radix Sort
• Assuming decimal elements and 10 buckets, we
would put each element into the bucket associated
with its units digit
• The buckets are actually queues so the elements
are added at the end of the bucket
• At the end of the pass, the buckets are
combined in increasing order

52
Radix Sort
• On the second pass, we separate the elements
based on the “tens” digit, and on the third pass
we separate them based on the “hundreds” digit

• Each pass must make sure to process the


elements in order and to put the buckets back
together in the correct order

53
Radix Sort Example

(Figure: first pass – elements are placed into buckets according to their units digit: 0, 1, 2, 3, …)
54
Radix Sort Example (continued)

The unit digits are already in order

Now start sorting the tens digit

55
Radix Sort Example (continued)
The unit and tens digits are already in order

Now start sorting the hundreds digit

Values in the buckets are now in order

56
Algorithm to sort a set of numeric keys

Begin
  Set shift := 1
  Repeat for pass = 1 to keysize step 1
    Repeat for entry = 1 to N step 1
      Set bucketNumber := (list[entry] / shift) mod 10
      Append( bucket[bucketNumber], list[entry] )
    End Repeat
    Set list := CombineBuckets()
    Set shift := shift * 10
  End Repeat
End
keysize: # of digits of the longest key

N: # of elements in list

bucketNumber: lies between 0 and 9

57
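A minimal Python sketch of this bucket-based pass structure, using Python lists as the FIFO buckets (function name and the sample keys are illustrative):

def radix_sort(values, keysize):
    # LSD radix sort on non-negative integers with at most 'keysize' digits.
    items = list(values)
    shift = 1
    for _ in range(keysize):
        buckets = [[] for _ in range(10)]
        for value in items:
            buckets[(value // shift) % 10].append(value)
        # Combine the buckets in increasing bucket order.
        items = [value for bucket in buckets for value in bucket]
        shift *= 10
    return items

# Example: radix_sort([329, 457, 657, 839, 436, 720, 355], keysize=3)
# returns [329, 355, 436, 457, 657, 720, 839]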
Radix Sort Analysis

• Each element is examined once for each of the
digits it contains, so if the elements have at most M
digits and there are N elements, this algorithm has
order O(M*N)
• This means that sorting is linear based on the
number of elements
• Why then isn’t this the only sorting algorithm
used?

58
Radix Sort Analysis
• Firstly, this sort can only be applied to inputs that
are based on known ranges of values (radixes)
• Also, this is very time efficient algorithm but not
space efficient
• If an array is used for the buckets, then for B buckets,
N*B extra memory locations are required, because all
of the elements may wind up in a single bucket

59
The End
