
DATA STRUCTURES

(CS3401)

Dr. Somaraju Suvvari,


Asst. Prof, Dept. of CSE, NIT Patna.
[email protected];
[email protected];
The Course

DATA STRUCTURES
SEARCHING
&
SORTING
UNIT VI: Searching & Sorting

Linear search, Binary search, Hashing.

Algorithms and data structures for sorting: Selection Sort, Bubble sort, Insertion
Sort, Merge sort, Quick Sort, Heap sort, Bucket sort.
SEARCHING

Searching – the process of looking for a specific element in a given list of elements.

Searching Techniques –

1. Linear Search

2. Binary Search
Linear Search

Linear Search – the process of comparing the given search element s with each and every element in the given list of elements A.

Notation: A is a list of elements; s is the element we want to search for in A.
Output: if successful, the position of s in A.

• Compares the search element with each and every element of the array/list (data structure); if the search element s is found in the list A, it returns the location of s in the list, otherwise it returns -1.
• Used mostly for unordered lists of elements.
• For example: int A[] = {10, 12, 23, 75, 3, 4, 9, 1, 8, 5, 14, 34, 56, 76, 31, 26};
  search element s = 8.
• s is found at position 9 (1-based).
Linear Search

Algorithm – Linear Search

Linear_Search(Element Type A[], Element Type s, int n) // n is the number of elements in A

Step 1 – Set i := 0.

Step 2 – Compare the search element s with the i-th element of the list A.

Step 3 – If they match, return i + 1.

Step 4 – If they do not match, increment i. If i is less than n, repeat Steps 2 and 3.

Step 5 – Return -1. // Element not found
Linear Search

Algorithm – Linear Search (variant that records the position in POS)

Linear_Search(Element Type A[], Element Type s, int n) // n is the number of elements in A

Step 1: [INITIALIZE] SET POS = -1, i = 0
Step 2: Repeat Steps 3 and 4 while i < n
Step 3:     IF A[i] = s
                SET POS = i + 1
                PRINT "Element found at position: POS"
                Go to Step 6
            [END OF IF]
Step 4:     SET i = i + 1
        [END OF LOOP]
Step 5: IF POS = -1
            PRINT "Element is not present in the array"
        [END OF IF]
Step 6: EXIT

Time Complexity = O(n) // In the worst case it makes n comparisons
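The step-wise algorithm maps directly onto a C function. A minimal sketch (the function name linear_search is ours, not from the slides); it returns the 1-based position as in Step 3, or -1 when the element is absent:

```c
#include <assert.h>

/* Returns the 1-based position of s in A[0..n-1], or -1 if not found. */
int linear_search(const int A[], int n, int s)
{
    for (int i = 0; i < n; i++) {
        if (A[i] == s)
            return i + 1;   /* matched: position = index + 1 */
    }
    return -1;              /* element not found */
}
```

With the slide's array {10, 12, 23, 75, 3, 4, 9, 1, 8, 5, 14, 34, 56, 76, 31, 26} and s = 8, the call linear_search(A, 16, 8) returns 9.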
Linear Search - Example

List of elements: A[] = {56, 20, 40, 55, 32, 12, 50, 99}, search element s = 12.

    56  20  40  55  32  12  50  99
   [0] [1] [2] [3] [4] [5] [6] [7]

Compare s = 12 with A[0] = 56: no match.
Compare s = 12 with A[1] = 20: no match.
Compare s = 12 with A[2] = 40: no match.
Compare s = 12 with A[3] = 55: no match.
Compare s = 12 with A[4] = 32: no match.
Compare s = 12 with A[5] = 12: matched at index 5.

6 comparisons are used.
Complexity Analysis

A[] = {56, 20, 40, 55, 32, 12, 50, 99}

• Best Case: the search element is found at the first index.
  Example: search element 56 is found at index 0 with 1 comparison. Time complexity is O(1).

• Worst Case: the search element is found at the last position, or is not present at all.
  Example: search element 57 is not found; all 8 elements are compared. Time complexity is O(n).

• Average Case: the search element is found around the middle of the array.
  Example: search element 32 is found at index 4. Time complexity is O(n).
Limitations of Linear Search

• Linear search cannot take advantage of the elements being in sorted order to reduce the number of comparisons.

  Example: searching for 99 in the sorted array {10, 12, 20, 32, 50, 55, 65, 99} still requires comparing all 8 elements.
Binary Search

• The input for binary search is a sorted list of elements (a precondition).
• Idea – compare the search element with the middle element of the list. There are three possibilities:
  1. If the middle element is greater than the search element, then the search element can only be in the first half of the list (the search range is reduced to half).
  2. If the middle element is less than the search element, then the search element can only be in the second half of the list (the search range is reduced to half).
  3. If the middle element matches the search element, return the position of the middle element.
• These steps are repeated until a match is found or the elements are exhausted.

Example: s = 65 in A[] = {10, 13, 20, 32, 50, 55, 65, 78, 99}, indices [0..8].
  Probe 1: low = 0, high = 8, middle = 4; A[4] = 50 < 65, so search the right half. TNC = 1.
  Probe 2: low = 5, high = 8, middle = 6; A[6] = 65 = 65, found. TNC = 2.

65 is found at index 6 with two comparisons instead of the 7 comparisons linear search would need.
(TNC – Total Number of Comparisons)
Binary Search - Example

S = 10 in A[] = {10, 12, 20, 32, 50, 55, 65, 78, 99}, indices [0..8].

1. low = 0, high = 8, mid = (0+8)/2 = 4; 10 < A[4] = 50, so high = 3.
2. low = 0, high = 3, mid = (0+3)/2 = 1; 10 < A[1] = 12, so high = 0.
3. low = 0, high = 0, mid = 0; 10 = A[0], found at index 0.

Total number of comparisons = 3, approximately log n.
Binary Search - Example

S = 8 (not present) in A[] = {10, 12, 20, 32, 50, 55, 65, 78, 99}.

1. low = 0, high = 8, mid = 4; 8 < A[4] = 50, so high = 3.
2. low = 0, high = 3, mid = 1; 8 < A[1] = 12, so high = 0.
3. low = 0, high = 0, mid = 0; 8 < A[0] = 10, so high = -1.
4. low = 0, high = -1: no elements left to search, so the element is not found.

Total number of comparisons = 3, approximately log n.
Binary Search - Example

S = 80 (not present) in A[] = {10, 12, 20, 32, 50, 55, 65, 78, 99}.

1. low = 0, high = 8, mid = (0+8)/2 = 4; 80 > A[4] = 50, so low = 5.
2. low = 5, high = 8, mid = (5+8)/2 = 6; 80 > A[6] = 65, so low = 7.
3. low = 7, high = 8, mid = (7+8)/2 = 7; 80 > A[7] = 78, so low = 8.
4. low = 8, high = 8, mid = (8+8)/2 = 8; 80 < A[8] = 99, so high = 7.
5. low = 8, high = 7: low is greater than high, so searching stops and the element is declared not found.

Total number of comparisons = 4, approximately log n + 1.
Binary Search

Assumption: the data elements of the list are in sorted order.

Step-wise description, int Binary_Search(Element Type A[], Element Type s, int n):

• Step 1 - Find the middle element of the sorted list.
• Step 2 - Compare the search element with the middle element of the sorted list.
• Step 3 - If they match, display "Given element is found!!!" and terminate the function.
• Step 4 - If they do not match, check whether the search element is smaller or larger than the middle element.
• Step 5 - If the search element is smaller than the middle element, repeat Steps 1 to 4 on the left sub list of the middle element.
• Step 6 - If the search element is larger than the middle element, repeat Steps 1 to 4 on the right sub list of the middle element.
• Step 7 - Repeat the process until the search element is found or the sub list becomes empty.
• Step 8 - If no element matches the search element, display "Element is not found in the list!!!" and terminate the function.

Pseudocode, Binary_Search(A, low, high, s): // low is the lower index of the list, high is the upper index
1. Set i = low, j = high
2. while (i <= j)
   i.   m = (i + j) / 2
   ii.  if (A[m] == s)
            return m
   iii. else if (A[m] > s)
            j = m - 1
   iv.  else
            i = m + 1
3. Return -1 // Element not found
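As a C sketch (a direct rendering of the pseudocode; the only change is that mid is computed as low + (high - low) / 2, which equals (low + high) / 2 but avoids integer overflow for very large indices):

```c
#include <assert.h>

/* Iterative binary search on a sorted array A[low..high].
   Returns the index of s, or -1 if s is not present. */
int binary_search(const int A[], int low, int high, int s)
{
    while (low <= high) {
        int mid = low + (high - low) / 2;  /* middle of current range */
        if (A[mid] == s)
            return mid;                    /* matched */
        else if (A[mid] > s)
            high = mid - 1;                /* search the left half */
        else
            low = mid + 1;                 /* search the right half */
    }
    return -1;                             /* low > high: not found */
}
```

With the slide's array {10, 13, 20, 32, 50, 55, 65, 78, 99}, binary_search(A, 0, 8, 65) returns 6 and binary_search(A, 0, 8, 80) returns -1.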
Cases - Binary Search

A[] = {10, 12, 20, 32, 50, 55, 65, 78, 99}, indices [0..8].

Best Case: searching element ele = 50.
  Probe 1: mid = 4; A[4] = 50 = 50, found.
  Comparisons = 1, so this is the best case. Time complexity = O(1).

Average Case: searching element ele = 99.
  Probe 1: mid = 4; 50 < 99, search the right half.
  Probe 2: mid = 6; 65 < 99, search the right half.
  Probe 3: mid = 7; 78 < 99, search the right half.
  Probe 4: mid = 8; 99 = 99, found. Comparisons = 4.

Worst Case: the element is not found, e.g. ele = 5.
  Probe 1: mid = 4; 5 < 50, search the left half.
  Probe 2: mid = 1; 5 < 12, search the left half.
  Probe 3: mid = 0; 5 < 10, search the left half; now low > high, so the element is not found.
  Time complexity is O(log n).
Binary Search Time Complexity

T(n) = T(n/2) + 1
T(n/2) = T(n/2^2) + 1, so T(n) = T(n/2^2) + 1 + 1 = T(n/2^2) + 2
T(n/2^2) = T(n/2^3) + 1, so T(n) = T(n/2^3) + 1 + 1 + 1 = T(n/2^3) + 3
.
.
T(n/2^(k-1)) = T(n/2^k) + 1, so T(n) = T(n/2^k) + k

The recursion stops when there is a single element, i.e., 2^k = n.
Taking log on both sides, k = log n.
So the total number of comparisons is log n, and T(n) = O(log n).
SORTING
Definition – Sorting refers to the operation of arranging data in some given order, such as increasing or decreasing for numerical data, or alphabetical for character data.

Types of sorting –
1. Internal sorting - Sorts data that resides in the computer's memory.
2. External sorting - Deals with sorting data stored in files. External sorting is applied when there is a large amount of data that cannot fit in memory.

Example - Multiway merging
SORTING

Internal Sorting Methods –


1. Selection sort
2. Bubble sort
3. Insertion sort
4. Merge sort
5. Quick sort
6. Heap sort
7. Bucket Sort …….
SORTING - Terminology

In-place sorting - A sorting algorithm is called in-place if it uses constant extra space for sorting the elements. It sorts the elements by changing their order within the given list.

Stable sorting - A sorting algorithm is called stable if two elements with equal keys appear in the same order in the sorted output as they appear in the input array.
SELECTION SORTING

• The simplest technique to sort the given list of elements


Idea
1. Select the smallest element in the given list of n elements and place it in the first position.
2. Now Select the smallest element in the remaining list of n-1 elements and place it in the second
position.

• Repeat this procedure until the entire array is sorted.


Selection Sorting
Detailed procedure

• The first element in the list is selected and compared repeatedly with all the remaining elements in the list. If any element is smaller than the selected element, the two are swapped, so that the first position ends up holding the smallest element.

• Next, we select the element at the second position in the list and compare it with all the remaining elements. If any element is smaller than the selected element, the two are swapped.

• This procedure is repeated until the entire list is sorted.
Selection Sort - Example
Let A[] = {31, 52, 29, 87, 63, 27, 19, 54}. We want to sort the elements in ascending order, so that after sorting A[0] ≤ A[1] ≤ A[2] … ≤ A[n-1].

Pass-1 – Bring the smallest element into position zero by comparing the first element with each remaining element; whenever a smaller element is found, swap the two. At the end of the first pass the smallest element is in the first place.

  31 52 29 87 63 27 19 54   Is 52 < 31? No, no swap required.
  31 52 29 87 63 27 19 54   Is 29 < 31? Yes, swap them: 29 52 31 87 63 27 19 54
  29 52 31 87 63 27 19 54   Is 87 < 29? No. Is 63 < 29? No.
  29 52 31 87 63 27 19 54   Is 27 < 29? Yes, swap them: 27 52 31 87 63 29 19 54
  27 52 31 87 63 29 19 54   Is 19 < 27? Yes, swap them: 19 52 31 87 63 29 27 54
  19 52 31 87 63 29 27 54   Is 54 < 19? No, no swap required.

After pass-1, the smallest element (19) is in the first position: 19 52 31 87 63 29 27 54
Selection Sort - Example
Pass-2 – Bring the second smallest element into the second position by comparing the second element with each remaining element; whenever a smaller element is found, swap the two.

  19 52 31 87 63 29 27 54   Is 31 < 52? Yes, swap them: 19 31 52 87 63 29 27 54
  19 31 52 87 63 29 27 54   Is 87 < 31? No. Is 63 < 31? No.
  19 31 52 87 63 29 27 54   Is 29 < 31? Yes, swap them: 19 29 52 87 63 31 27 54
  19 29 52 87 63 31 27 54   Is 27 < 29? Yes, swap them: 19 27 52 87 63 31 29 54
  19 27 52 87 63 31 29 54   Is 54 < 27? No, no swap required.

After pass-2, the second smallest element (27) is in the second position: 19 27 52 87 63 31 29 54
Selection Sort - Example
Pass-3 – Bring the third smallest element into the third position by comparing the third element with each remaining element; whenever a smaller element is found, swap the two.

  19 27 52 87 63 31 29 54   Is 87 < 52? No. Is 63 < 52? No.
  19 27 52 87 63 31 29 54   Is 31 < 52? Yes, swap them: 19 27 31 87 63 52 29 54
  19 27 31 87 63 52 29 54   Is 29 < 31? Yes, swap them: 19 27 29 87 63 52 31 54
  19 27 29 87 63 52 31 54   Is 54 < 29? No, no swap required.

After pass-3, the third smallest element (29) is in the third position: 19 27 29 87 63 52 31 54
Selection Sort - Example
Pass-4 – Bring the fourth smallest element into the fourth position by comparing the fourth element with each remaining element; whenever a smaller element is found, swap the two.

  19 27 29 87 63 52 31 54   Is 63 < 87? Yes, swap them: 19 27 29 63 87 52 31 54
  19 27 29 63 87 52 31 54   Is 52 < 63? Yes, swap them: 19 27 29 52 87 63 31 54
  19 27 29 52 87 63 31 54   Is 31 < 52? Yes, swap them: 19 27 29 31 87 63 52 54
  19 27 29 31 87 63 52 54   Is 54 < 31? No, no swap required.

After pass-4, the fourth smallest element (31) is in the fourth position: 19 27 29 31 87 63 52 54
Selection Sort - Example
Pass-5 – Bring the fifth smallest element into the fifth position by comparing the fifth element with each remaining element; whenever a smaller element is found, swap the two.

  19 27 29 31 87 63 52 54   Is 63 < 87? Yes, swap them: 19 27 29 31 63 87 52 54
  19 27 29 31 63 87 52 54   Is 52 < 63? Yes, swap them: 19 27 29 31 52 87 63 54
  19 27 29 31 52 87 63 54   Is 54 < 52? No, no swap required.

After pass-5, the fifth smallest element (52) is in the fifth position: 19 27 29 31 52 87 63 54
Selection Sort - Example
Pass-6 – Bring the sixth smallest element into the sixth position by comparing the sixth element with each remaining element; whenever a smaller element is found, swap the two.

  19 27 29 31 52 87 63 54   Is 63 < 87? Yes, swap them: 19 27 29 31 52 63 87 54
  19 27 29 31 52 63 87 54   Is 54 < 63? Yes, swap them: 19 27 29 31 52 54 87 63

After pass-6, the sixth smallest element (54) is in the sixth position: 19 27 29 31 52 54 87 63
Selection Sort - Example
Pass-7 – Bring the seventh smallest element into the seventh position by comparing the seventh element with the remaining element; if a smaller element is found, swap the two.

  19 27 29 31 52 54 87 63   Is 63 < 87? Yes, swap them: 19 27 29 31 52 54 63 87

After pass-7, the seventh smallest element (63) is in the seventh position.

Finally:  19 27 29 31 52 54 63 87
         [0] [1] [2] [3] [4] [5] [6] [7]
ALGORITHM - Selection Sort
Selection_sort(Element Type A[], int n) // Assumption – index starts with 0
1. Set i = 0
2. while (i < n-1) // Number of passes
   i.  Set j = i + 1 // j starts from the element after i
   ii. while (j < n) // j runs from i+1 to n-1
       i.  if (A[j] < A[i]) // if the element at i is greater than the element at j, swap them
           i. Swap(A[i], A[j])
       ii. j = j + 1
   iii. i = i + 1
3. Return.
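The algorithm translates line for line into C. A sketch that keeps the slide's behaviour of swapping immediately whenever a smaller element is found:

```c
#include <assert.h>

/* Selection sort as on the slide: in pass i, compare A[i] with every
   A[j] for j > i and swap immediately whenever A[j] < A[i]. */
void selection_sort(int A[], int n)
{
    for (int i = 0; i < n - 1; i++) {      /* pass i fixes position i */
        for (int j = i + 1; j < n; j++) {
            if (A[j] < A[i]) {             /* smaller element found: swap */
                int tmp = A[i];
                A[i] = A[j];
                A[j] = tmp;
            }
        }
    }
}
```

Running it on the slide's array {31, 52, 29, 87, 63, 27, 19, 54} produces {19, 27, 29, 31, 52, 54, 63, 87}.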
Selection Sort – Time Complexity

Number of comparisons in pass-1: n-1
Number of comparisons in pass-2: n-2
...
Number of comparisons in pass (n-2): 2
Number of comparisons in pass (n-1): 1

Total comparisons = (n-1) + (n-2) + … + 2 + 1 = n(n-1)/2, so the time complexity is O(n²).
Selection Sort

• Can we reduce the number of swaps?
  • Yes.
• How?
  • In pass k, find the index of the k-th smallest element, and at the end of the pass swap the k-th element with the element at that index.
• Is selection sort an in-place sorting algorithm?
  • Yes.
• Is selection sort a stable sorting algorithm?
  • No.
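The swap-reducing idea can be sketched as follows (a variant of the function above; the name selection_sort_minidx is ours): each pass only records the index of the smallest element and performs at most one swap at the end.

```c
#include <assert.h>

/* Swap-reducing selection sort: pass i finds the index of the smallest
   element of A[i..n-1], then does a single swap at the end of the pass. */
void selection_sort_minidx(int A[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int min = i;                     /* index of smallest seen so far */
        for (int j = i + 1; j < n; j++)
            if (A[j] < A[min])
                min = j;                 /* remember the index, don't swap yet */
        if (min != i) {                  /* at most one swap per pass */
            int tmp = A[i];
            A[i] = A[min];
            A[min] = tmp;
        }
    }
}
```

This does at most n-1 swaps in total, while the eager-swap version may swap several times per pass; the number of comparisons is unchanged.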
Selection Sort
Advantages
1. Simple technique.
2. In-place sorting (no extra memory required).

Disadvantages
1. Too many comparisons.
Bubble sort

Repeatedly move the largest element to the highest index position of the array.

Bubble sort –

The algorithm does two steps:

1. Start at one end of the array and make repeated scans through the list, comparing successive pairs of elements (adjacent elements).

2. If the first element of a pair is larger than the second (called an inversion), the values are swapped.
Bubble sort

Idea – The algorithm works by repeatedly stepping through the list to be sorted, comparing each pair of adjacent items and swapping them if they are in the wrong order.

The pass through the list is repeated until no swaps are needed.

This technique is called bubble sort or sinking sort because the smaller values gradually "bubble" their way upward to the top of the array like air bubbles rising in water, while the larger values sink to the bottom of the array.
Example - Sorting the given numbers using Bubble Sort
Pass - 1 (input: 3 5 2 6 8 1 9)
• 3 5 2 6 8 1 9 (compare 3 and 5: no change)
• 3 5 2 6 8 1 9 (compare 5 and 2: swap: 3 2 5 6 8 1 9)
• 3 2 5 6 8 1 9 (compare 5 and 6: no change)
• 3 2 5 6 8 1 9 (compare 6 and 8: no change)
• 3 2 5 6 8 1 9 (compare 8 and 1: swap: 3 2 5 6 1 8 9)
• 3 2 5 6 1 8 9 (compare 8 and 9: no change)
Result: 3 2 5 6 1 8 9

(Observation – After pass-1 the largest element has reached its position, and the smaller elements are slowly moving up.)
(The number of comparisons is 6, i.e., n-1.)
Example - Sorting the given numbers using Bubble Sort
Pass-2
• 3 2 5 6 1 8 9 (compare 3 and 2: swap: 2 3 5 6 1 8 9)
• 2 3 5 6 1 8 9 (compare 3 and 5: no change)
• 2 3 5 6 1 8 9 (compare 5 and 6: no change)
• 2 3 5 6 1 8 9 (compare 6 and 1: swap: 2 3 5 1 6 8 9)
• 2 3 5 1 6 8 9 (compare 6 and 8: no change)
Result: 2 3 5 1 6 8 9

(There is no need to compare the last element with its previous element – why? Because the largest element is already in its final position.)

(Observation – The second largest element has reached its position, and the smaller elements are slowly moving up.)

(The number of comparisons is 5, i.e., n-2.)

Example - Sorting the given numbers using Bubble Sort
Pass-3
• 2 3 5 1 6 8 9 (compare 2 and 3: no change)
• 2 3 5 1 6 8 9 (compare 3 and 5: no change)
• 2 3 5 1 6 8 9 (compare 5 and 1: swap: 2 3 1 5 6 8 9)
• 2 3 1 5 6 8 9 (compare 5 and 6: no change)
Result: 2 3 1 5 6 8 9

(Observation – The third largest element has reached its position, and the smaller elements are slowly moving up.)

(The number of comparisons is 4, i.e., n-3.)
Example - Sorting the given numbers using Bubble Sort
Pass-4
• 2 3 1 5 6 8 9 (compare 2 and 3: no change)
• 2 3 1 5 6 8 9 (compare 3 and 1: swap: 2 1 3 5 6 8 9)
• 2 1 3 5 6 8 9 (compare 3 and 5: no change)
Result: 2 1 3 5 6 8 9

(Observation – The fourth largest element has reached its position, and the smaller elements are slowly moving up.)

(The number of comparisons is 3, i.e., n-4.)
Example - Sorting the given numbers using Bubble Sort
Pass-5
• 2 1 3 5 6 8 9 (compare 2 and 1: swap: 1 2 3 5 6 8 9)
• 1 2 3 5 6 8 9 (compare 2 and 3: no change)
Result: 1 2 3 5 6 8 9

(Observation – The fifth largest element has reached its position, and the smaller elements are slowly moving up.)

(The number of comparisons is 2, i.e., n-5.)
Example - Sorting the given numbers using Bubble Sort
Pass-6
• 1 2 3 5 6 8 9 (compare 1 and 2: no change)
Result: 1 2 3 5 6 8 9

(Observation – The sixth largest element has reached its position, so the last element (the seventh one here) has also reached its position.)

(The number of comparisons is 1, i.e., n-6.)
Example - Sorting the given numbers using Bubble Sort (already sorted input)
Pass - 1 (input: 3 5 6 7 8 9)
• 3 5 6 7 8 9 (compare 3 and 5: no change)
• 3 5 6 7 8 9 (compare 5 and 6: no change)
• 3 5 6 7 8 9 (compare 6 and 7: no change)
• 3 5 6 7 8 9 (compare 7 and 8: no change)
• 3 5 6 7 8 9 (compare 8 and 9: no change)

Observation – How can we stop early when all the elements are already in sorted order?
Answer – If no swap happens during a pass, skip the remaining passes.
ALGORITHM - Bubble Sort

Bubble_Sort(Element Type A[], int n)

1. Set i = 0 // Assumption – index starts with 0
2. while (i < n-1) // Number of passes
   i.  Set j = 0, flag = 0 // flag records whether any swap happened in this pass
   ii. while (j < n - i - 1) // Number of comparisons in pass i
       i.  if (A[j] > A[j+1]) // compare the j-th element with the (j+1)-th element
           i.  Swap(A[j], A[j+1]) // if they are out of order, swap them
           ii. flag = 1 // a swap happened in this pass
       ii. j = j + 1
   iii. if flag == 0 // no swap happened in pass i, so the list is sorted
       i. break
   iv. i = i + 1
3. Return.
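A C sketch of the algorithm, including the flag-based early exit described above:

```c
#include <assert.h>

/* Bubble sort with early exit: if a pass performs no swap,
   the array is already sorted and the remaining passes are skipped. */
void bubble_sort(int A[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int flag = 0;                        /* any swap in this pass? */
        for (int j = 0; j < n - i - 1; j++) {
            if (A[j] > A[j + 1]) {           /* inversion found: swap */
                int tmp = A[j];
                A[j] = A[j + 1];
                A[j + 1] = tmp;
                flag = 1;
            }
        }
        if (flag == 0)                       /* no swap: already sorted */
            break;
    }
}
```

On an already-sorted input, the inner loop makes n-1 comparisons, finds no inversion, and the outer loop stops after a single pass.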
Bubble Sort – Time Complexity (average and worst cases)

Number of comparisons in pass-1: n-1
Number of comparisons in pass-2: n-2
...
Number of comparisons in pass (n-2): 2
Number of comparisons in pass (n-1): 1

Total comparisons = (n-1) + (n-2) + … + 2 + 1 = n(n-1)/2, so the time complexity is O(n²).
Bubble Sort – Time Complexity (Best case)

Number of comparisons in pass-1: n-1.
If the input is already sorted, no swap happens in pass-1 and the algorithm stops, so the best-case time complexity is O(n).
Bubble Sort
Advantages
1. It can detect whether the input is already sorted.
2. In-place sorting (no extra memory required).

Disadvantages
1. Too many comparisons.
INSERTION SORT
The idea of insertion sort is similar to sorting playing cards:

Pick a card and place it in the right place by moving the other cards to the right.

• The elements are considered as a sorted portion and an unsorted portion.

• Every iteration moves one element from the unsorted portion to the sorted portion, until all the elements in the list are sorted.

• The insertion sort algorithm is performed using the following steps:

• Step 1 - Assume that the first element in the list is in the sorted portion and all the remaining elements are in the unsorted portion.

• Step 2 - Take the first element from the unsorted portion and insert it into the sorted portion at the correct position.

• Step 3 - Repeat the above process until all the elements from the unsorted portion have been moved into the sorted portion.
Insertion Sort
• The insertion sort scans input array a from a[1] to a[n], inserting each element a[k] into its
proper position in the previously sorted sub array a[1], a[2], …., a[k-1].

• a[1] by itself is trivially sorted.

a[2] is inserted either before or after a[1], so that a[1], a[2] is sorted.

• a[3] is inserted in its proper place in a[1], a[2], that is, before a[1], between a[1] and a[2], or
after a[2], so that a[1], a[2], a[3] is sorted.

…….

• a[n] is inserted into its proper place in a[1], a[2], …., a[n-1], so that a[1], a[2], ….., a[n] is
sorted.
Insertion Sort - Example
Example: a[] = {3, 7, 4, 9, 5, 2, 6, 1}.
In each step, the key under consideration is shown in brackets.

3                 (a[1] by itself is trivially sorted)
3 [7]             (7 is greater than its previous element, so no work to be done)
3 7 [4]           (4 is less than its previous element, so move 4 one place left)
3 [4] 7           (4 is greater than its previous element, so no work to be done)
3 4 7 [9]         (9 is greater than its previous element, so no work to be done)
3 4 7 9 [5]       (5 is less than its previous element, so move 5 one place left)
3 4 7 [5] 9       (5 is less than its previous element, so move 5 one place left)
3 4 [5] 7 9       (5 is greater than its previous element, so no work to be done)
Insertion Sort - Example
Example: a[] = {3, 7, 4, 9, 5, 2, 6, 1}.
In each step, the key under consideration is shown in brackets.

3 4 5 7 9 [2]     (2 is less than its previous element, so move 2 one place left)
3 4 5 7 [2] 9     (2 is less than its previous element, so move 2 one place left)
3 4 5 [2] 7 9     (2 is less than its previous element, so move 2 one place left)
3 4 [2] 5 7 9     (2 is less than its previous element, so move 2 one place left)
3 [2] 4 5 7 9     (2 is less than its previous element, so move 2 one place left)
2 3 4 5 7 9       (2 has reached its place)
2 3 4 5 7 9 [6]   (6 is less than its previous element, so move 6 one place left)
2 3 4 5 7 [6] 9   (6 is less than its previous element, so move 6 one place left)
2 3 4 5 [6] 7 9   (6 is greater than its previous element, so no work to be done)
Insertion Sort - Example
Example: a[] = {3, 7, 4, 9, 5, 2, 6, 1}.
In each step, the key under consideration is shown in brackets.

2 3 4 5 6 7 9 [1]   (1 is less than its previous element, so move 1 one place left)
2 3 4 5 6 7 [1] 9   (1 is less than its previous element, so move 1 one place left)
2 3 4 5 6 [1] 7 9   (1 is less than its previous element, so move 1 one place left)
2 3 4 5 [1] 6 7 9   (1 is less than its previous element, so move 1 one place left)
2 3 4 [1] 5 6 7 9   (1 is less than its previous element, so move 1 one place left)
2 3 [1] 4 5 6 7 9   (1 is less than its previous element, so move 1 one place left)
2 [1] 3 4 5 6 7 9   (1 is less than its previous element, so move 1 one place left)
1 2 3 4 5 6 7 9     (1 has reached its place; the list is sorted)
ALGORITHM – Insertion Sort
• Algorithm // Assumption: index starts with 0

  int a[100]             // declare array a of size 100
  Input n                // n is how many numbers to sort
  for i = 0 to n-1       // read the array elements
      Read a[i]
  for i = 1 to n-1
      for (k = i; k > 0 and a[k] < a[k-1]; k--)
          swap a[k] and a[k-1]   // invariant: a[0..i] is sorted
  output a[0] to a[n-1]
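The inner loops of the algorithm can be written as a C function (0-based indexing, matching the stated assumption; the swap-based inner loop mirrors the pseudocode):

```c
#include <assert.h>

/* Insertion sort: grow a sorted prefix a[0..i-1]; sink a[i] to the left
   until it reaches its proper place in that prefix. */
void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        for (int k = i; k > 0 && a[k] < a[k - 1]; k--) {
            int tmp = a[k];          /* swap a[k] and a[k-1] */
            a[k] = a[k - 1];
            a[k - 1] = tmp;
        }                            /* invariant: a[0..i] is sorted */
    }
}
```

On the worked example {3, 7, 4, 9, 5, 2, 6, 1} this yields {1, 2, 3, 4, 5, 6, 7, 9}.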


Insertion Sort – Time Complexity (average and worst cases)

Number of comparisons in pass-1: up to 1
Number of comparisons in pass-2: up to 2
...
Number of comparisons in pass (n-2): up to n-2
Number of comparisons in pass (n-1): up to n-1

Total comparisons = 1 + 2 + … + (n-1) = n(n-1)/2, so the time complexity is O(n²).
Insertion Sort – Time Complexity (Best case)

When the input is already sorted, each pass (pass-1, pass-2, …, pass (n-1)) makes only 1 comparison.
Total comparisons = n-1, so the best-case time complexity is O(n).
Insertion Sort
Advantages
1. If the input is (nearly) sorted, it does only n + d comparisons, where d is the number of inversions.
2. Insertion sort can sort the list as it receives it (online).
3. Practically more efficient than selection and bubble sort.
4. In-place sorting (no extra memory required).
5. Stable sorting.

Disadvantages
1. Too many comparisons (O(n²)) if the input is sorted in reverse order.
Merge Sort
• Merge sort is a sorting algorithm that uses the divide, conquer, and combine
algorithmic paradigm.

Divide and Conquer – It is an algorithm design technique to solve problems. It


follows three steps:
1. Divide – Divide the given problem into smaller problems

2. Conquer – Conquer the subproblems recursively

3. Combine solutions of subproblems into one for the original problem

(Examples – Binary search, Merge sort, Quick sort, etc)


Merge Sort
• Merging is the process of combining two sorted lists to make one bigger sorted list

Basic Idea of Merge sort


1. Divide the given input list into two halves
(Apply this step on the divided sub lists till the sub list size reached to one)

2. Recursively sort each sub list.

3. Merge the two sorted sub lists (From bottom to top).


Idea is to sort the size one sub lists and merge them, Apply this procedure for sizes 2, 4, 8, ……
Merge Sort (Example)
Sort the following eight elements using merge sort: 4, 2, 56, 8, 34, 1, 98, 23

1. Divide the given input list into two halves, repeatedly, until each sub list has a single element:

A[1,8]: 4 2 56 8 34 1 98 23
A[1,4]: 4 2 56 8                A[5,8]: 34 1 98 23
A[1,2]: 4 2    A[3,4]: 56 8     A[5,6]: 34 1    A[7,8]: 98 23
A[1,1]: 4  A[2,2]: 2  A[3,3]: 56  A[4,4]: 8  A[5,5]: 34  A[6,6]: 1  A[7,7]: 98  A[8,8]: 23
Merge Sort (Example)
2. Recursively sort each sub list and merge the sorted sub lists, from bottom to top. Each merge writes into a temporary array T[], which is then copied back into A:

Merge A[1,1] and A[2,2]: T = 2 4, copy back: A[1,2] = 2 4
Merge A[3,3] and A[4,4]: T = 8 56, copy back: A[3,4] = 8 56
Merge A[5,5] and A[6,6]: T = 1 34, copy back: A[5,6] = 1 34
Merge A[7,7] and A[8,8]: T = 23 98, copy back: A[7,8] = 23 98
Merge A[1,2] and A[3,4]: T = 2 4 8 56, copy back: A[1,4] = 2 4 8 56
Merge A[5,6] and A[7,8]: T = 1 23 34 98, copy back: A[5,8] = 1 23 34 98
Merge A[1,4] and A[5,8]: T = 1 2 4 8 23 34 56 98, copy back: A[1,8] = 1 2 4 8 23 34 56 98
Merge Sort (pseudo code)
void Merge_Sort(int A[], int low, int high) // A – input array, low – lower index, high – higher index
{
if (low < high) // If there are more than one element
{
int mid = (low + high) / 2;
Merge_Sort(A, low, mid); // call the merge sort on first half of the elements of A
Merge_Sort(A, mid+1, high); // call the merge sort on second half of the elements of A
Merge(A, low, mid, high); // Merge the sorted elements of the first half and second half
}
}
Merge Sort (pseudo code)
void Merge(int A[], int low, int mid, int high) // A – input array, low – lower index, high – higher index
{ int i, j, k, t[high-low+1], n = high - low + 1; // t is a temporary array
// i indexes array t; j and k index the two sorted halves of A
for(i=0, j = low, k = mid+1; j <= mid && k <= high; )
{ // copy the smaller of the two front elements into t
if (A[j] <= A[k]) { t[i++] = A[j++]; }
else { t[i++] = A[k++]; }
}
while(j <= mid) { t[i++] = A[j++]; } // copy the remaining elements of the first half of A to t
while(k <= high) { t[i++] = A[k++]; } // copy the remaining elements of the second half of A to t
// copy the sorted elements back into A at their respective positions
i = 0, j = low;
while(i < n)
{ A[j++] = t[i++]; }
}
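The pseudocode above can be turned into a compilable C version; this sketch keeps the same function names but heap-allocates the temporary array (a stack VLA, as in the pseudocode, would also work for small inputs):

```c
#include <stdlib.h>

/* Merge the two sorted halves A[low..mid] and A[mid+1..high]. */
void Merge(int A[], int low, int mid, int high)
{
    int n = high - low + 1;
    int *t = malloc(n * sizeof(int));      /* temporary array */
    int i = 0, j = low, k = mid + 1;

    while (j <= mid && k <= high)          /* pick the smaller front element */
        t[i++] = (A[j] <= A[k]) ? A[j++] : A[k++];
    while (j <= mid)  t[i++] = A[j++];     /* leftovers from the first half */
    while (k <= high) t[i++] = A[k++];     /* leftovers from the second half */

    for (i = 0, j = low; i < n; i++, j++)  /* copy back into A */
        A[j] = t[i];
    free(t);
}

void Merge_Sort(int A[], int low, int high)
{
    if (low < high) {                      /* more than one element */
        int mid = (low + high) / 2;
        Merge_Sort(A, low, mid);
        Merge_Sort(A, mid + 1, high);
        Merge(A, low, mid, high);
    }
}
```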
Merge Sort (pseudo code – Explanation with example)
1. Merge_Sort(A, 0, 3)
2. Merge_Sort(A, 0, 1)
2.1 Merge_Sort(A, 0, 0)
2.2 Merge_Sort(A, 1, 1)
2.3 Merge(A, 0, 0, 1)
3. Merge_Sort(A, 2, 3)
3.1 Merge_Sort(A, 2, 2)
3.2 Merge_Sort(A, 3, 3)
3.3 Merge(A, 2, 2, 3)
4. Merge(A, 0, 1, 3)
Merge Sort Time Complexity
The recurrence is T(n) = 2T(n/2) + n. Expanding it:

T(n) = 2T(n/2) + n = 2^1 T(n/2^1) + n
T(n/2) = 2T(n/4) + n/2, so T(n) = 2(2T(n/4) + n/2) + n = 2^2 T(n/2^2) + 2n
T(n/4) = 2T(n/8) + n/4, so T(n) = 4(2T(n/8) + n/4) + 2n = 2^3 T(n/2^3) + 3n
.
.
T(n/2^(k-1)) = 2T(n/2^k) + n/2^(k-1), so T(n) = 2^k T(n/2^k) + kn

The expansion stops when a sub list has a single element, i.e., 2^k = n.
Taking log on both sides, k = log n.
So the total number of comparisons is n log n, and
T(n) = O(n log n)
Merge Sort Space Complexity
• Each merge operation uses a temporary array.

• Across the levels of recursion the temporary arrays grow as 2, 4, 8, …, n, but at most n extra cells are needed at any one time.

• So merge sort takes extra memory of size n.

• Space Complexity of Merge Sort is

T(n) = O(n)
Quick Sort
• Quick sort also follows the divide-and-conquer algorithmic paradigm. It is also
called partition-exchange sort.

• Idea
• Select one element (called the pivot) from the elements to sort (generally the first element).

• Partition the given array into two sub lists such that one list contains all the elements less than or
equal to the chosen element (the pivot) and the other list contains the elements greater than
the pivot.
• Now the pivot element is in its sorted position.

• Apply the same procedure on the sub lists till there are no sub lists.
Quick Sort
• Partition procedure
1. Select the first element in the list as the pivot element.
2. Initialize a pointer called left at the first element of the list, and another pointer
called right at the last element of the list.
3. Repeatedly move the left pointer to the next position while it points to an element less
than or equal to the pivot, or until it reaches the last position.
4. Repeatedly move the right pointer to the previous position while it points to an element
greater than the pivot, or until it reaches the first position.
5. If left < right // if they do not cross each other
1. Swap the elements pointed to by left and right and continue steps 3 and 4.
6. If left >= right // if they cross each other or point to the same position
1. Swap the right pointed element and the pivot element.
Quick Sort (Example)
[23 19 12 11 25 4 89 34]
Pivot = 23; left starts at 23, right starts at 34.
Move the left pointer until it finds an element greater than the pivot: it passes 23, 19, 12, 11 and stops at 25.
Move the right pointer until it finds an element smaller than or equal to the pivot: it passes 34, 89 and stops at 4.
left < right, so swap the left and right pointed elements: 23 19 12 11 4 25 89 34
Move left past 4 to 25; move right back to 4; the pointers have crossed.
Swap the pivot element with the right pointed element (4):
[4 19 12 11] [23] [25 89 34]
23 is now in its sorted position. Now apply the partition procedure on the left sub list of 23.
Quick Sort (Example continuation)
[4 19 12 11] [23] [25 89 34]
Pivot = 4; left stops immediately at 19 (greater than the pivot); right moves past 11, 12, 19 back to 4 itself; the pointers have crossed.
Swap the pivot element with the right pointed element (4 with itself):
[4] [19 12 11] [23] [25 89 34]
Now apply the partition procedure on the right sub list of 4 (as its left sub list does not have any elements).
Pivot = 19; left moves past 12, 11 to the last position; right stops at 11.
Swap the pivot element with the right pointed element (11):
[4] [11 12] [19] [23] [25 89 34]
Now apply the partition procedure on the left sub list of 19.
Quick Sort (Example continuation)
[4] [11 12] [19] [23] [25 89 34]
Pivot = 11; left stops at 12 (greater than the pivot); right moves back to 11 itself; the pointers have crossed.
Swap the pivot element with the right pointed element (11 with itself):
[4] [11] [12] [19] [23] [25 89 34]
The right sub list of 11 holds the single element 12, so it is automatically in its position.
Now apply the partition procedure on the right sub list of 23 (as its left sub list is already sorted).
Pivot = 25; left stops at 89 (greater than the pivot); right moves past 34, 89 back to 25 itself; the pointers have crossed.
Swap the pivot element with the right pointed element (25 with itself):
[4] [11] [12] [19] [23] [25] [89 34]
Now apply the partition procedure on the right sub list of 25 (as its left sub list does not have any elements).
Quick Sort (Example continuation)
[4] [11] [12] [19] [23] [25] [89 34]
Pivot = 89; left moves past 34 to the last position; right stops at 34.
Swap the pivot element with the right pointed element (34):
[4] [11] [12] [19] [23] [25] [34] [89]
The left sub list of 89 holds the single element 34, so it is automatically in its position. The whole list is now sorted: 4 11 12 19 23 25 34 89.
Quick Sort (pseudo code)
void Quick_Sort(int A[], int low, int high) // A – input array, low – lower index, high – higher index
{
if (low < high) // If there are more than one element
{
int p = Partition(A, low, high);
Quick_Sort(A, low, p-1); // call quick sort on the elements before the pivot
Quick_Sort(A, p+1, high); // call quick sort on the elements after the pivot
}
}
Quick Sort (pseudo code)
int Partition(int A[], int low, int high) // A – input array, low – lower index, high – higher index
{ int pivot = A[low], left = low, right = high;
while(left < right)
{ while( left < high && A[left] <= pivot) {left++;}
while( right > low && A[right] > pivot) {right--;}
if(left < right)
swap( &A[left], &A[right]);
}
swap(&A[low], &A[right]);
return right;
}
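Putting the two routines together, a compilable C sketch (including the swap helper the pseudocode assumes) might look like:

```c
/* swap helper assumed by the pseudocode above */
void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

int Partition(int A[], int low, int high)
{
    int pivot = A[low], left = low, right = high;
    while (left < right) {
        while (left < high && A[left] <= pivot) left++;   /* find element > pivot   */
        while (right > low && A[right] > pivot) right--;  /* find element <= pivot  */
        if (left < right)
            swap(&A[left], &A[right]);
    }
    swap(&A[low], &A[right]);   /* put the pivot into its final position */
    return right;
}

void Quick_Sort(int A[], int low, int high)
{
    if (low < high) {
        int p = Partition(A, low, high);
        Quick_Sort(A, low, p - 1);   /* sort the elements before the pivot */
        Quick_Sort(A, p + 1, high);  /* sort the elements after the pivot  */
    }
}
```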
Pivot Selection Strategies
Strategy 1
The popular one is selecting the first element as the pivot. This is good if the input
is random, but it is not good if the input is pre-sorted or sorted in reverse order.

Strategy 2
The second strategy is to choose the pivot randomly. It is one of the preferable ones.

Strategy 3 (Median-of-three partitioning)
Pick three elements randomly and use the median of these three as the pivot.
Quick Sort Time Complexity
Best case and average case: the pivot splits the list into two roughly equal halves, giving the
recurrence T(n) = 2T(n/2) + n, so, as with merge sort, T(n) = O(n log n).

Worst case: the pivot is always the smallest or largest element (for example, a pre-sorted input
with the first element as pivot), giving the recurrence T(n) = T(n-1) + n, so T(n) = O(n^2).
Bucket Sort
Bucket sort is applied when the input is uniformly distributed over a range; it is also used
when there are floating point numbers in a fixed range.

Idea:

1. Sort the elements by first dividing the given elements into several groups called buckets.

2. The elements in each bucket are sorted using any suitable sorting algorithm or by recursively
calling the same algorithm.

3. Collect the elements from the first bucket to the last bucket.
Bucket Sort
For example consider the elements: 56, 23, 46, 3, 1, 89, 45, 73. [the elements are in the range 1 to 100]

Assume we use four buckets [1-25 in the first bucket, 26-50 in the second, 51-75 in the third and the remainder in the fourth]

Bucket-1: 1, 3, 23    Bucket-2: 46, 45    Bucket-3: 56, 73    Bucket-4: 89

Now sort the elements in each bucket using a suitable sorting algorithm (quick sort, insertion sort, or any other sorting algorithm):

Bucket-1: 1, 3, 23    Bucket-2: 45, 46    Bucket-3: 56, 73    Bucket-4: 89

Now collect the elements from the buckets: 1, 3, 23, 45, 46, 56, 73, 89
Bucket Sort (Pseudo code)
#define B 100 // number of buckets
void Bucket_Sort(int A[], int n, int p)
// n is the number of elements, p is the maximum allowed element (for example if our input range is 1 to 100 then p is 100)
{ int j, k;
List buckets[B]; // one list per bucket
for(j=0; j<n; j++)
{ Insert(A[j], buckets[A[j]/(p/B)]); } // Insert the element A[j] in bucket A[j]/(p/B)
for(j=0; j<B; j++)
Sort(buckets[j]); // Sort the elements in bucket j
k=0;
for(j=0; j<B; j++)
k = Concatenate(A, k, buckets[j]); // Copy bucket j back into A starting at position k;
// k is updated with the number of elements copied so far
}
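Since Insert, Sort and Concatenate are left abstract above, here is one possible self-contained C sketch of the same idea; the bucket count (4), the fixed per-bucket capacity and the use of insertion sort inside each bucket are illustrative choices, not part of the pseudocode:

```c
#define NBUCKETS 4
#define CAP 64             /* illustrative per-bucket capacity */

/* sort one bucket in place with insertion sort */
static void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) { a[j + 1] = a[j]; j--; }
        a[j + 1] = key;
    }
}

/* sort A[0..n-1], whose values lie in 1..p, with NBUCKETS buckets */
void Bucket_Sort(int A[], int n, int p)
{
    int bucket[NBUCKETS][CAP], count[NBUCKETS] = {0};
    int width = (p + NBUCKETS - 1) / NBUCKETS;   /* value range per bucket */

    for (int i = 0; i < n; i++) {                /* scatter into buckets */
        int b = (A[i] - 1) / width;
        bucket[b][count[b]++] = A[i];
    }
    int k = 0;
    for (int b = 0; b < NBUCKETS; b++) {         /* sort each bucket, then gather */
        insertion_sort(bucket[b], count[b]);
        for (int i = 0; i < count[b]; i++)
            A[k++] = bucket[b][i];
    }
}
```

On the worked example (56, 23, 46, 3, 1, 89, 45, 73 with p = 100) this produces the same four buckets as the slide and the same final order.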
Bucket Sort Complexity
Best case and average case
If the elements are distributed evenly across the buckets (or the buckets are already nearly
sorted and we use insertion sort):
T(n) = O(n) // if n >> k, where k is the number of buckets

Worst case
If most of the elements fall into a few buckets and the elements are in reverse sorted
order and we use insertion sort:
T(n) = O(n^2)

Space Complexity
T(n) = O(n + k) // n elements distributed over k buckets
Hashing
Hashing – Hashing is a technique used for storing and retrieving information as quickly
as possible.

Why Hashing?

Most searching algorithms, like linear search and binary search, do the searching
operation in linear or logarithmic time. We want to do it quicker than that. Hashing is a
technique that is faster than these algorithms (most of the time we want to search in
constant time).
Hashing
For example, suppose we want an algorithm for finding the first repeated character in a given string.

Solution 1: If the given string is stored in an array, compare each character with every other
character in the array; the first character for which a comparison succeeds is the first repeated
one in the given string. The complexity of this algorithm is O(n^2).

Solution 2: Create an array of size 256 (since there are 256 distinct extended-ASCII characters) and
initialize all entries with zeros. For each character of the input string, go to its corresponding
position in the array and increment the value. Since we are using an array, it takes constant time
to reach a location. The first character whose count becomes two is the first repeating character
in the given string. The complexity of this algorithm is O(n).

The second solution uses a hashing technique.
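Solution 2 can be sketched in C as follows (first_repeated is a hypothetical helper name, not from the slides; it returns '\0' when no character repeats):

```c
/* Return the first character whose second occurrence appears in s,
 * or '\0' if no character repeats. */
char first_repeated(const char *s)
{
    int count[256] = {0};                 /* one counter per character value */
    for (int i = 0; s[i] != '\0'; i++) {
        unsigned char c = (unsigned char)s[i];
        if (++count[c] == 2)              /* count reached two: first repeat */
            return s[i];
    }
    return '\0';
}
```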
Components of Hashing
1. Hash Table – An array to store the elements (the common convention is to have the table run
from 0 to TableSize-1).

2. Hash Function – A specific method for calculating the array index from the element. Each
key (a key is a string with an associated value) is mapped into some number in the range 0 to
TableSize-1 and placed in the appropriate cell. The mapping is called the hash function. For example,
the following is a common hash function used in most applications where the input keys
are integers:

Key % TableSize

3. Collision Resolution Technique – Algorithms and data structures to handle the situation
where two keys are hashed into the same array index (must give an alternative location for the
second one).
Hashing
⮚ An efficient hash function should be designed so that it distributes the index values of
inserted elements uniformly across the table.

⮚ An efficient collision resolution technique must compute an alternative index for a key
whose hash index corresponds to a location already occupied in the hash table.

⮚ We must choose a hash function which can be calculated quickly, returns values within
the range of locations in our table and minimizes collisions.
Hashing (Example)
Store the following elements in a hash table of size 8 using a suitable hash function:

2, 7, 13, 16, 25

We use the hash function h(x) = x % TableSize:

h(2) = 2 % 8 = 2
h(7) = 7 % 8 = 7
h(13) = 13 % 8 = 5
h(16) = 16 % 8 = 0
h(25) = 25 % 8 = 1

The resulting table (indices 0-7): 0 → 16, 1 → 25, 2 → 2, 5 → 13, 7 → 7; locations 3, 4 and 6 remain empty.
Collision Resolution Techniques
Collision – "When two or more keys hash into the same location."

Collision Resolution Techniques

• Direct Chaining
  • Separate Chaining

• Open Addressing
  • Linear Probing
  • Quadratic Probing
  • Double Hashing
Collision Resolution Techniques – Separate Chaining
Separate Chaining – When two or more elements hash to the same location, these records are
linked together into a singly linked list called a chain.

For example consider the following set of elements: 20, 9, 17, 39, 43, 78, 69, 23, 12, 33, 48, 4, 7.

Hash TableSize = 10 and hash function h(x) = x % TableSize.

0 → 20
2 → 12
3 → 43 → 23 → 33
4 → 4
7 → 17 → 7
8 → 78 → 48
9 → 9 → 39 → 69
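One possible C sketch of a chained hash table with insert and search; for simplicity it inserts new nodes at the front of each chain rather than at the tail as drawn above (a design choice, not required by the technique):

```c
#include <stdlib.h>

#define TABLE_SIZE 10

typedef struct Node {
    int key;
    struct Node *next;
} Node;

Node *chains[TABLE_SIZE];              /* one chain per slot, initially NULL */

int hash(int key) { return key % TABLE_SIZE; }

/* insert key at the front of its slot's chain */
void chain_insert(int key)
{
    Node *n = malloc(sizeof(Node));
    n->key = key;
    n->next = chains[hash(key)];
    chains[hash(key)] = n;
}

/* return 1 if key is present, 0 otherwise */
int chain_search(int key)
{
    for (Node *n = chains[hash(key)]; n != NULL; n = n->next)
        if (n->key == key) return 1;
    return 0;
}
```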
Collision Resolution Technique – Linear Probing
Separate chaining has the limitation of requiring pointers. This slows the algorithm
down a bit because of the time required to allocate the new cells, and it also requires the
implementation of a second data structure.

Open addressing is an alternative technique to resolve collisions. When a
collision occurs, alternative locations are tried until an empty location is found.

Linear Probing – We search the hash table sequentially, starting from the original hash
location, till we find a free location:
Rehash(x) = (x+1) % TableSize
          = (x+2) % TableSize
          .
          .
          .
Linear Probing (Example)
Consider the elements: 79, 18, 49, 58, 69
Hash TableSize = 10 and h(x) = x % TableSize

Insert 79: 79 % 10 = 9 (free location), so 79 goes to location 9.
Insert 18: 18 % 10 = 8 (free location), so 18 goes to location 8.
Insert 49: 49 % 10 = 9 (not a free location), so rehash: (49+1) % 10 = 0 (free location), so 49 goes to location 0.
Insert 58: 58 % 10 = 8 (not a free location); (58+1) % 10 = 9 (not free); (58+2) % 10 = 0 (not free); (58+3) % 10 = 1 (free location), so 58 goes to location 1.
Insert 69: 69 % 10 = 9 (not a free location); (69+1) % 10 = 0 (not free); (69+2) % 10 = 1 (not free); (69+3) % 10 = 2 (free location), so 69 goes to location 2.

Final table: 0 → 49, 1 → 58, 2 → 69, 8 → 18, 9 → 79.
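The probing loop in the example can be sketched in C as follows (lp_init and lp_insert are illustrative names; the sentinel -1 marks a free slot):

```c
#define TABLE_SIZE 10
#define EMPTY (-1)

int slots[TABLE_SIZE];

/* mark every slot as free */
void lp_init(void)
{
    for (int i = 0; i < TABLE_SIZE; i++)
        slots[i] = EMPTY;
}

/* insert key with linear probing; return its slot, or -1 if the table is full */
int lp_insert(int key)
{
    int home = key % TABLE_SIZE;
    for (int i = 0; i < TABLE_SIZE; i++) {
        int slot = (home + i) % TABLE_SIZE;   /* probe h(x), h(x)+1, h(x)+2, ... */
        if (slots[slot] == EMPTY) {
            slots[slot] = key;
            return slot;
        }
    }
    return -1;                                /* no free slot found */
}
```

Inserting 79, 18, 49, 58, 69 in order lands them in slots 9, 8, 0, 1 and 2, matching the worked example.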
Collision Resolution Technique – Quadratic Probing
Quadratic Probing – Starting from the original hash location i, we check the locations i + 1^2,
i + 2^2, i + 3^2, ...
Rehash(x) = (x+1^2) % TableSize
          = (x+2^2) % TableSize
          .
          .
          .
Quadratic Probing (Example)
Consider the elements: 79, 18, 49, 58, 69
Hash TableSize = 10 and h(x) = x % TableSize

Insert 79: 79 % 10 = 9 (free location), so 79 goes to location 9.
Insert 18: 18 % 10 = 8 (free location), so 18 goes to location 8.
Insert 49: 49 % 10 = 9 (not a free location), so rehash: (49+1^2) % 10 = 0 (free location), so 49 goes to location 0.
Insert 58: 58 % 10 = 8 (not a free location); (58+1^2) % 10 = 9 (not free); (58+2^2) % 10 = 2 (free location), so 58 goes to location 2.
Insert 69: 69 % 10 = 9 (not a free location); (69+1^2) % 10 = 0 (not free); (69+2^2) % 10 = 3 (free location), so 69 goes to location 3.

Final table: 0 → 49, 2 → 58, 3 → 69, 8 → 18, 9 → 79.
Collision Resolution Technique – Double Hashing
Double Hashing – Use a second hash function whenever a collision occurs.

h2 ≠ h1

Use h1(x) + h2(x), or h1(x) + i*h2(x), etc.

Here i starts from 1 and increases with each collision.

A popular second hash function is

R – (x mod R)

where R is a prime number (typically smaller than the table size).
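A minimal C sketch of the probe sequence h1(x) + i*h2(x), assuming TableSize = 10 and R = 7 (both illustrative choices):

```c
#define TABLE_SIZE 10
#define R 7                          /* a prime smaller than the table size */

int h1(int x) { return x % TABLE_SIZE; }
int h2(int x) { return R - (x % R); }     /* popular second hash: R - (x mod R) */

/* slot probed after i collisions (i = 0 gives the home slot) */
int probe(int x, int i)
{
    return (h1(x) + i * h2(x)) % TABLE_SIZE;
}
```

For example, key 49 has home slot 9; on a collision the next probe is (9 + 7) % 10 = 6, a different step size than keys with other h2 values, which is what breaks up the clusters that linear probing creates.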

Heap Sort will be discussed after covering priority
queues and trees
Thank You
