
UNIT 5

SEARCHING AND SORTING

BY:
PROF. DEEPTI A. CHAUDHARI
CONTENTS
 SORTING TECHNIQUES
 SEARCHING METHODS
 SKIP LIST
 FILE HANDLING
1. BUBBLE SORT
Bubble sort works by repeatedly swapping adjacent elements until they are in the
intended order. It is called bubble sort because the movement of array elements is just like the
movement of air bubbles in water. Bubbles in water rise up to the surface; similarly, the larger
array elements in bubble sort move to the end in each iteration.
Although it is simple to implement, it is primarily used as an educational tool because the
performance of bubble sort is poor in the real world. It is not suitable for large data sets. The
average and worst-case complexity of bubble sort is O(n2), where n is the number of items.
Bubble sort is mainly used where -
 complexity does not matter
 simple and short code is preferred
if (array[i] > array[i + 1])
{
// swapping occurs if elements
// are not in the intended order
int temp = array[i];
array[i] = array[i + 1];
array[i + 1] = temp;
}
Bubble Sort program
void bubblesort(int a[], int n)
{
int i, j, temp;
for(i=0;i<n-1;i++)
{
for(j=0;j<n-i-1;j++)
{
if(a[j]>a[j+1])
{
temp=a[j];
a[j]=a[j+1];
a[j+1]=temp;
}
}
}
}
Bubble sort Complexity
The time complexity of bubble sort in the best, average, and worst cases is given below.
1. Time Complexity
Case Time Complexity
Best Case O(n)
Average Case O(n2)
Worst Case O(n2)
 Best Case Complexity - It occurs when there is no sorting required, i.e. the array
is already sorted. The best-case time complexity of bubble sort is O(n).
 Average Case Complexity - It occurs when the array elements are in jumbled
order that is not properly ascending and not properly descending. The average
case time complexity of bubble sort is O(n2).
 Worst Case Complexity - It occurs when the array elements are required to be
sorted in reverse order. That means suppose you have to sort the array elements in
ascending order, but its elements are in descending order. The worst-case time
complexity of bubble sort is O(n2).
SELECTION SORT
In selection sort, the smallest value among the unsorted elements of the array is selected in every pass
and moved to its appropriate position in the array. It is one of the simplest sorting algorithms, and it is an in-place
comparison sorting algorithm. In this algorithm, the array is divided into two parts: the first is the sorted part, and
the other is the unsorted part. Initially, the sorted part of the array is empty, and the unsorted part is the
given array. The sorted part is placed at the left, while the unsorted part is placed at the right.
In selection sort, the smallest element is selected from the unsorted array and placed at the first
position. After that, the second smallest element is selected and placed in the second position. The process
continues until the array is entirely sorted.
The average and worst-case complexity of selection sort is O(n2), where n is the number of items.
Due to this, it is not suitable for large data sets.
Selection sort is generally used when -
 A small array is to be sorted
 Swapping cost doesn't matter
 It is compulsory to check all elements
void selection_sort(int a[], int n)
{
int i,j,small, temp;
for(i=0;i<n-1;i++)
{
small=i;
for(j=i+1;j<n;j++)
{
if(a[j]<a[small])
{
small=j;
}
}
temp=a[small];
a[small]=a[i];
a[i]=temp;
}
}
Selection sort Time Complexity
 Best Case Complexity - It occurs when there is no sorting required, i.e.
the array is already sorted. The best-case time complexity of selection sort
is O(n2).
 Average Case Complexity - It occurs when the array elements are in
jumbled order that is not properly ascending and not properly descending.
The average case time complexity of selection sort is O(n2).
 Worst Case Complexity - It occurs when the array elements are required
to be sorted in reverse order. That means suppose you have to sort the
array elements in ascending order, but its elements are in descending order.
The worst-case time complexity of selection sort is O(n2).
INSERTION SORT
This is an in-place comparison-based sorting algorithm.
Here, a sub-list is maintained which is always sorted. For
example, the lower part of an array is maintained to be
sorted. An element which is to be inserted into this sorted sub-
list has to find its appropriate place and then be
inserted there. Hence the name, insertion sort.
The array is searched sequentially and unsorted items
are moved and inserted into the sorted sub-list (in the same
array). This algorithm is not suitable for large data sets as its
average and worst case complexity are O(n2), where n is
the number of items.
Implementation of Insertion Sort
#include <stdio.h>
int main()
{
int i, j, n, temp, a[10];
printf("How many numbers u are going to enter?: ");
scanf("%d",&n);
printf("Enter %d elements: ", n);
for(i=0;i<n;i++)
scanf("%d",&a[i]);
for(i=1;i<n;i++)
{
temp=a[i];
j=i-1;
while((j>=0)&&(temp<a[j]))
{
a[j+1]=a[j];
j=j-1;
}
a[j+1]=temp;
}
printf("Order of Sorted elements: ");
for(i=0;i<n;i++)
printf(" %d",a[i]);
return 0;
}
Insertion Sort Time Complexity
 Best Case Complexity - It occurs when there is no sorting
required, i.e. the array is already sorted. The best-case time
complexity of insertion sort is O(n).
 Average Case Complexity - It occurs when the array
elements are in jumbled order that is not properly ascending
and not properly descending. The average case time
complexity of insertion sort is O(n2).
 Worst Case Complexity - It occurs when the array elements
are required to be sorted in reverse order. That means
suppose you have to sort the array elements in ascending
order, but its elements are in descending order. The worst-
case time complexity of insertion sort is O(n2).
Quick Sort
 Sorting is a way of arranging items in a systematic manner. Quicksort is a widely used sorting
algorithm that makes n log n comparisons in the average case for sorting an array of n elements. It is a fast
and highly efficient sorting algorithm. This algorithm follows the divide and conquer approach. Divide
and conquer is a technique of breaking an algorithm down into sub-problems, solving the sub-problems,
and combining the results back together to solve the original problem.
 Divide: In Divide, first pick a pivot element. After that, partition or rearrange the array into two sub-
arrays such that each element in the left sub-array is less than or equal to the pivot element and each
element in the right sub-array is larger than the pivot element.
 Conquer: Recursively, sort two subarrays with Quicksort.
 Combine: Combine the already sorted array.
Quicksort picks an element as pivot, and then it partitions the given array around the picked pivot
element. In quick sort, a large array is divided into two arrays in which one holds values that are smaller
than the specified value (Pivot), and another array holds the values that are greater than the pivot.
After that, left and right sub-arrays are also partitioned using the same approach. It will continue until
the single element remains in the sub-array.
 Choosing the pivot
Picking a good pivot is necessary for a fast implementation of quicksort. However, it is difficult to
determine a good pivot. Some of the ways of choosing a pivot are as follows -
 Pivot can be random, i.e. select a random pivot from the given array.
 Pivot can be either the rightmost element or the leftmost element of the given array.
 Select the median as the pivot element.
1. Time Complexity
Best Case Complexity - In Quicksort, the best-case occurs when the pivot element is the
middle element or near to the middle element. The best-case time complexity of quicksort is
O(n*log n).
Average Case Complexity - It occurs when the array elements are in jumbled order that is not
properly ascending and not properly descending. The average case time complexity of
quicksort is O(n*log n).
Worst Case Complexity - In quick sort, worst case occurs when the pivot element is either
greatest or smallest element. Suppose, if the pivot element is always the last element of the
array, the worst case would occur when the given array is sorted already in ascending or
descending order. The worst-case time complexity of quicksort is O(n2).
2. Space Complexity
The space complexity of quicksort is O(log n) on average, due to the recursion stack, and O(n) in the worst case.
Program
void quicksort(int a[], int first, int last)
{
int i,j,pivot,temp;
if(first<last)
{
pivot=first;
i=first;
j=last;
while(i<j)
{
while(a[i]<=a[pivot] && i<last)
{
i++;
}
while(a[j]>a[pivot])
{
j--;
}
if(i<j)
{
temp=a[i];
a[i]=a[j];
a[j]=temp;
}
}
temp=a[pivot];
a[pivot]=a[j];
a[j]=temp;
quicksort(a,first,j-1);
quicksort(a,j+1,last);
}
}
Merge Sort
 Merge sort is similar to the quick sort algorithm as it uses the divide and conquer approach
to sort the elements. It is one of the most popular and efficient sorting algorithms. It divides
the given list into two equal halves, calls itself for the two halves, and then merges the two
sorted halves. We have to define the merge() function to perform the merging.
 The sub-lists are divided again and again into halves until the list cannot be divided further.
Then we combine the pairs of one-element lists into two-element lists, sorting them in the
process. The sorted two-element lists are merged into four-element lists, and so on until
we get the sorted list.
Time Complexity
 Best Case Complexity - It occurs when there is no sorting required, i.e. the array is already
sorted. The best-case time complexity of merge sort is O(n*logn).
 Average Case Complexity - It occurs when the array elements are in jumbled order that is
not properly ascending and not properly descending. The average case time complexity of
merge sort is O(n*logn).
 Worst Case Complexity - It occurs when the array elements are required to be sorted in
reverse order. That means suppose you have to sort the array elements in ascending order,
but its elements are in descending order. The worst-case time complexity of merge sort is
O(n*logn).
Space Complexity
The space complexity of merge sort is O(n), because merge sort requires an
auxiliary array of size n to merge the sorted halves.
Merge Sort Program
void merge(int a[], int lb, int mid, int ub);
void mergesort(int a[], int lb, int ub)
{
int mid;
if(lb<ub)
{
mid=(lb+ub)/2;
mergesort(a,lb,mid);
mergesort(a, mid+1,ub);
merge(a,lb,mid,ub);
}
}
void merge(int a[], int lb, int mid, int ub)
{
int i,j,k;
int b[20];
i=lb;
j=mid+1;
k=lb;
while(i<=mid && j<=ub)
{
if(a[i]<=a[j])
{
b[k]=a[i];
k++;
i++;
}
else
{
b[k]=a[j];
j++;
k++;
}
}
while(i<=mid)
{
b[k]=a[i];
i++;
k++;
}
while(j<=ub)
{
b[k]=a[j];
k++;
j++;
}
for(k=lb;k<=ub;k++)
{
a[k]=b[k];
}
}
Sequential Search
Two popular search methods are linear search and binary search. Here we will discuss
the first searching technique, i.e., the linear search algorithm.
Linear search is also called the sequential search algorithm. It is the simplest searching
algorithm. In linear search, we simply traverse the list completely and match each element of
the list with the item whose location is to be found. If a match is found, then the location of
the item is returned; otherwise, the algorithm returns -1.
It is widely used to search for an element in an unordered list, i.e., a list in which items are
not sorted. The worst-case time complexity of linear search is O(n).
The steps used in the implementation of Linear Search are listed as follows -
 First, we have to traverse the array elements using a for loop.
 In each iteration of for loop, compare the search element with the current array element,
and –
 If the element matches, then return the index of the corresponding array element.
 If the element does not match, then move to the next element.
 If there is no match or the search element is not present in the given array, return -1.
 Now, let's see the implementation of linear search.
#include <stdio.h>
int linearSearch(int a[], int n, int val) {
// Going through the array sequentially
for (int i = 0; i < n; i++)
{
if (a[i] == val)
return i+1;
}
return -1;
}
int main() {
int a[] = {70, 40, 30, 11, 57, 41, 25, 14, 52}; // given array
int val = 41; // value to be searched
int n = sizeof(a) / sizeof(a[0]); // size of array
int res = linearSearch(a, n, val); // Store result
printf("The elements of the array are - ");
for (int i = 0; i < n; i++)
printf("%d ", a[i]);
printf("\nElement to be searched is - %d", val);
if (res == -1)
printf("\nElement is not present in the array");
else
printf("\nElement is present at %d position of array", res);
return 0;
}
Binary Search
Binary search is the search technique that works efficiently on sorted lists. Hence, to
search an element into some list using the binary search technique, we must ensure that the
list is sorted.
Binary search follows the divide and conquer approach in which the list is divided into
two halves, and the item is compared with the middle element of the list. If the match is
found then, the location of the middle element is returned. Otherwise, we search into either
of the halves depending upon the result produced through the match.
NOTE: Binary search can be implemented only on sorted array elements. If the list elements are
not arranged in sorted order, we first have to sort them.

Time Complexity
 Best Case Complexity - In Binary search, best case occurs when the element to search is
found in first comparison, i.e., when the first middle element itself is the element to be
searched. The best-case time complexity of Binary search is O(1).
 Average Case Complexity - The average case time complexity of Binary search is
O(logn).
 Worst Case Complexity - In Binary search, the worst case occurs, when we have to keep
reducing the search space till it has only one element. The worst-case time complexity of
Binary search is O(logn).
#include <stdio.h>
int binarysearch(int a[], int first, int last, int val)
{
int mid;
while(first<=last)
{
mid=(first+last)/2;
if(val==a[mid])
return mid;
else if(val<a[mid])
last=mid-1;
else
first=mid+1;
}
return -1;
}
int main()
{
int a[20], val, n, res;
printf("Enter how many elements u want to insert - ");
scanf("%d",&n);
printf("Enter %d elements in sorted order: ", n);
for(int i=0;i<n;i++)
{
scanf("%d",&a[i]);
}
printf("\nElement to be searched is - ");
scanf("%d",&val);
res = binarysearch(a,0,n-1,val);
if(res == -1)
printf("\nElement is not present in the array");
else
printf("\nElement is present at %d position of array", res);
return 0;
}
Skip list
What is a skip list?
 A skip list is a probabilistic data structure. A skip list stores a sorted list of
elements or data using a linked list. It allows the elements or data to be processed
efficiently. In a single step, it skips several elements of the entire list, which is why it
is known as a skip list.
 The skip list is an extended version of the linked list. It allows the user to search,
remove, and insert elements very quickly. It consists of a base list that includes a set
of elements which maintains the link hierarchy of the subsequent elements.
Skip list structure
 It is built in two layers: The lowest layer and Top layer.
 The lowest layer of the skip list is a common sorted linked list, and the top layers of the
skip list are like an "express line" where the elements are skipped.
Complexity of Skip List
The average-case time complexity of search, insertion, and deletion in a skip list is O(log n),
while the worst case is O(n). The space complexity is O(n).