07 Searching
J. Kizito
Makerere University
e-mail: [email protected]
www: https://www.socnetsolutions.com/~jona
materials: https://www.socnetsolutions.com/~jona/materials/CSC1204
e-learning environment: http://muele.mak.ac.ug
office: block A, level 3, department of computer science
alt. office: institute of open, distance, and eLearning
Searching Algorithms
1 Searching Algorithms
Linear Search
Binary Search
Interpolation Search
Hashing
2 Algorithm Comparison
3 Algorithm Implementation
Popular Algorithms
1 Linear (Sequential) search
2 Binary (Half-interval) search
3 Interpolation search
4 Hashing
Linear Search
Search through the whole list from one end to the other
Starting at the first item in the list, we simply move from item to
item, following the underlying sequential ordering until we either find
what we are looking for or run out of items
If we run out of items, we have discovered that the item we were
searching for was not present
Best case: 1 comparison; worst case: n; average case: n/2 (when the item is present)
Linear Search
Algorithms (1)
Unordered List
def sequentialSearch(alist, item):
    pos = 0
    found = False
    while pos < len(alist) and not found:
        if alist[pos] == item:
            found = True
        else:
            pos = pos+1
    return found
Linear Search
Algorithms (2)
Ordered List
def orderedSequentialSearch(alist, item):
    pos = 0
    found = False
    stop = False
    while pos < len(alist) and not found and not stop:
        if alist[pos] == item:
            found = True
        else:
            if alist[pos] > item:
                stop = True
            else:
                pos = pos+1
    return found
Binary Search
Split the list into two roughly equal sub-lists and search one of the halves
Makes clever comparisons by taking advantage of the ordered (sorted) list
1 We start by examining the middle item
2 If the item we are searching for is, say, greater than the middle item, we can eliminate the lower half and the middle item; the search item, if present, must belong to the upper half
3 We then repeat the process with the selected half
1st comparison leaves about n/2 items; 2nd → n/4; 3rd → n/8; ... ; mth (last) → 1
At the mth comparison, we have n/2^m = 1. Solving for m gives m = log2 n, i.e., O(log n)
Binary Search
Algorithms (1)
Iterative Version
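A minimal iterative sketch, assuming the same alist/item interface as the recursive version on the next slide:

def binarySearch(alist, item):
    first = 0
    last = len(alist) - 1
    found = False
    while first <= last and not found:
        midpoint = (first + last) // 2     # index of the middle item
        if alist[midpoint] == item:
            found = True
        elif item < alist[midpoint]:
            last = midpoint - 1            # discard the middle item and the upper half
        else:
            first = midpoint + 1           # discard the middle item and the lower half
    return found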
Binary Search
Algorithms (2)
Recursive Version
def binarySearch(alist, item):
    if len(alist) == 0:
        return False
    else:
        midpoint = len(alist) // 2    # integer division so midpoint is a valid index
        if alist[midpoint] == item:
            return True
        else:
            if item < alist[midpoint]:
                return binarySearch(alist[:midpoint], item)
            else:
                return binarySearch(alist[midpoint+1:], item)
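A quick usage check for either version, on a small sorted list (the sample values are illustrative):

testlist = [17, 20, 26, 31, 44, 54, 55, 65, 77, 93]
print(binarySearch(testlist, 65))   # True
print(binarySearch(testlist, 40))   # False

Note that each recursive call slices the list, which copies roughly half of it; passing first/last indices instead (as in the iterative version) avoids that extra work.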
Interpolation Search
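Interpolation search works like binary search but, instead of always probing the middle, estimates the probe position from the value being sought. A minimal sketch, assuming a sorted list of roughly uniformly distributed numeric keys (the function name is illustrative):

def interpolationSearch(alist, item):
    low = 0
    high = len(alist) - 1
    while low <= high and alist[low] <= item <= alist[high]:
        if alist[high] == alist[low]:
            return alist[low] == item      # all remaining keys are equal; avoid dividing by zero
        # estimate where item should sit between alist[low] and alist[high]
        pos = low + (item - alist[low]) * (high - low) // (alist[high] - alist[low])
        if alist[pos] == item:
            return True
        elif alist[pos] < item:
            low = pos + 1
        else:
            high = pos - 1
    return False

On uniformly distributed data this takes about O(log log n) probes on average, but it can degrade to O(n) on badly skewed data.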
Hashing
Sample Hash Functions (1)
Note that 6 of the 11 slots are now occupied. This proportion is referred to as the load factor, commonly denoted by λ = number of items (n) / table size (m) = 6/11
Suppose we now want to place a new value, 44. Since 44 % 11 = 0, it hashes to the slot already occupied by 77; this is referred to as a collision
A hash function that maps each item into a unique slot is referred to as a perfect
hash function
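A minimal sketch of the remainder-method example above, using the 11-slot table and the item set from these slides:

items = [54, 26, 93, 17, 77, 31]
table_size = 11

# remainder method: h(item) = item % 11
for item in items:
    print(item, "->", item % table_size)    # 54->10, 26->4, 93->5, 17->6, 77->0, 31->9

load_factor = len(items) / table_size       # lambda = n / m = 6/11
print(load_factor)                          # 0.5454...

print(44 % table_size)                      # 0 -- the slot already holding 77, i.e. a collision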
Hashing
Sample Hash Functions (2)
Hashing
Comparison of Remainder, Mid-Square, and Folding Methods
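A minimal sketch of the three methods for integer items, assuming an 11-slot table (the helper names and the two-digit groupings are illustrative):

def hash_remainder(item, table_size):
    # remainder method: divide by the table size and keep the remainder
    return item % table_size

def hash_mid_square(item, table_size):
    # mid-square method: square the item, take the middle two digits, then the remainder
    squared = str(item ** 2)
    mid = len(squared) // 2
    middle = int(squared[max(mid - 1, 0):mid + 1])
    return middle % table_size

def hash_folding(item, table_size, piece_size=2):
    # folding method: split the digits into equal-sized pieces, add them, then the remainder
    digits = str(item)
    pieces = [int(digits[i:i + piece_size]) for i in range(0, len(digits), piece_size)]
    return sum(pieces) % table_size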
Hashing
Collision Resolution
1 Linear Probing
2 Plus 3
3 Quadratic Probing
4 Chaining
Collision Resolution
Linear Probing
Start at the original hash value position and sequentially try to find the next open
slot in the hash table. Note that we may need to go back to the first slot
(circularly) to cover the entire table
Original set of items: 54, 26, 93, 17, 77, and 31:
Extended set: 54, 26, 93, 17, 77, 31, 44, 55, 20
Once a hash table is built using a given method, it is essential that we use the same
method to search for items
E.g., suppose we are looking for 20: its hash value is 9, but slot 9 holds 31.
We cannot simply return False, since we know there could have been collisions; we must keep probing forward in the same way (see the sketch below)
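A minimal sketch of linear probing for the extended item set above, assuming the remainder hash and an 11-slot table (the function names are illustrative):

def put_linear(table, item):
    # probe forward (wrapping around) from the home slot until an open slot is found
    slot = item % len(table)
    while table[slot] is not None:
        slot = (slot + 1) % len(table)
    table[slot] = item

def search_linear(table, item):
    # follow exactly the same probe sequence that was used when inserting
    slot = item % len(table)
    start = slot
    while table[slot] is not None:
        if table[slot] == item:
            return True
        slot = (slot + 1) % len(table)
        if slot == start:                   # wrapped all the way around: table is full
            return False
    return False

table = [None] * 11
for item in [54, 26, 93, 17, 77, 31, 44, 55, 20]:
    put_linear(table, item)
print(search_linear(table, 20))             # True: found after probing past 31, 54, 77, 44, 55
print(search_linear(table, 99))             # False: probing stops at an empty slot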
Collision Resolution
Plus 3
One way to deal with clustering is to extend the technique so that instead of
looking sequentially for the next open slot, we skip some slots, thereby more evenly
distributing the items that have caused collisions
“Plus 3” probe means that once a collision occurs, we will look at every third slot
until we find one that is empty
The process of looking for another slot after a collision is called rehashing
To ensure that the probe sequence eventually visits every slot (so that no part of the
table goes unused), it is often suggested that the table size be a prime number
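A minimal sketch of the plus-3 rehash step (the function name is illustrative):

def rehash_plus3(oldhash, table_size):
    # after a collision, skip ahead three slots, wrapping around the table
    return (oldhash + 3) % table_size

With a prime table size such as 11, repeating this step from any starting slot eventually visits every slot; with a table size of 9, a skip of 3 would cycle through only a third of the slots.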
Collision Resolution
Quadratic Probing
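Quadratic probing is a variant in which the skip grows: successive probes are offset by 1, 4, 9, 16, ... from the original hash value. A minimal sketch (the function name is illustrative):

def rehash_quadratic(hashvalue, i, table_size):
    # i-th probe after a collision: offset the home slot by i squared
    return (hashvalue + i * i) % table_size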
Collision Resolution
Chaining
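Chaining lets each slot hold a collection of items, so items that collide simply share a slot. A minimal sketch, assuming the remainder hash used earlier (the names are illustrative):

def build_chained_table(items, table_size=11):
    # each slot holds a (possibly empty) list of the items that hashed to it
    table = [[] for _ in range(table_size)]
    for item in items:
        table[item % table_size].append(item)
    return table

def search_chained(table, item):
    # hash to the slot, then do a short sequential search within that chain
    return item in table[item % len(table)]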
Algorithm Comparison
Number of operations (y axis) required to obtain a result as the number of elements (x axis) increases. O(n!) is the worst – it
requires 720 operations for just 6 elements, while O(1) is the best complexity – 1 operation for any number of elements
Source: http://bigocheatsheet.com/
Graph Traversal
See the accompanying PowerPoint slides for details
Breadth-First Search
Visits all the nodes at one level of the graph before proceeding to the
next level
Returns the path containing the least number of nodes (the
shallowest path)
Depth-First Search
Performs a pre-order traversal of the graph and returns the leftmost path
If we are lucky, this path happens to be the shallowest; if we are unlucky, it may be the deepest (see the sketch below)
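A minimal sketch of both traversals on an adjacency-list graph, each returning a path from start to goal (the example graph, dictionary representation, and function names are illustrative):

from collections import deque

def bfs_path(graph, start, goal):
    # visit nodes level by level; the first path that reaches the goal is the shallowest
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

def dfs_path(graph, start, goal, visited=None):
    # go as deep as possible along the leftmost unvisited branch before backtracking
    if visited is None:
        visited = set()
    if start == goal:
        return [start]
    visited.add(start)
    for neighbour in graph.get(start, []):
        if neighbour not in visited:
            rest = dfs_path(graph, neighbour, goal, visited)
            if rest is not None:
                return [start] + rest
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs_path(graph, "A", "D"))    # ['A', 'B', 'D'] -- fewest nodes
print(dfs_path(graph, "A", "D"))    # ['A', 'B', 'D'] here; in general DFS may return a deeper path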