
The Efficiency of Sorting Algorithms
Comparing Speed and Complexity

+91-7260058093
www.algotutor.io
1.Bubble sort
Bubble sort is one of the simplest sorting algorithms, often
used to teach the concept of sorting. Although it is not an
efficient algorithm for large data sets, it can be effective
for small lists. Bubble sort has a time complexity of
O(n^2), meaning that its running time grows
quadratically as the size of the list grows.
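
The idea can be made concrete with a minimal Python sketch. The function name bubble_sort and the early-exit flag are illustrative choices not taken from the text above; it assumes a list of comparable elements.

def bubble_sort(arr):
    # Repeatedly swap adjacent out-of-order elements until no swaps remain.
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After each pass, the largest remaining element "bubbles" to the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]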
2.Heap sort
Heap sort is a comparison-based sorting algorithm that
works by building a binary heap from the list of elements
to be sorted. It then repeatedly extracts the maximum
element from the heap, placing it at the end of the sorted
array, and maintaining the heap property for the
remaining elements. Heap sort has a time complexity of
O(n log n), making it an efficient algorithm for large data
sets.
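
A possible in-place version of the procedure described above is sketched below; the names heap_sort and sift_down are hypothetical, and the max-heap is stored directly in the input list.

def heap_sort(arr):
    # Build a max-heap, then repeatedly move the maximum to the end of the array.
    n = len(arr)

    def sift_down(root, end):
        # Restore the max-heap property for the subtree at `root`, ignoring indices >= end.
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1  # pick the larger child
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    # Heapify: turn the whole array into a max-heap.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)

    # Repeatedly swap the maximum (index 0) into the sorted suffix and re-heapify.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr

print(heap_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]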
3.Insertion sort
Insertion sort is a simple sorting algorithm that works by
iteratively building a sorted array, one element at a time. It
begins by considering the first element in the array to be the
sorted portion, and the remaining elements to be the
unsorted portion. The algorithm then iterates through the
unsorted portion, removing each element and inserting it
into its correct position in the sorted portion. Insertion sort
has a time complexity of O(n^2), making it relatively
inefficient for large data sets, but it is efficient for small data
sets and has low overhead.
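
The following short Python sketch mirrors that description; the name insertion_sort and the variable key are illustrative, and the sort is done in place.

def insertion_sort(arr):
    # Grow a sorted prefix one element at a time.
    for i in range(1, len(arr)):
        key = arr[i]          # next element taken from the unsorted portion
        j = i - 1
        # Shift larger elements of the sorted portion one slot to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key      # insert the element into its correct position
    return arr

print(insertion_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]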
4.Merge sort
Merge sort is a divide-and-conquer sorting algorithm that works by
recursively dividing a list into smaller sublists, sorting the sublists,
and then merging them back together. The algorithm first divides the
list into two halves, and then recursively divides each half until only
individual elements remain. It then merges the individual elements back
into sorted pairs, and then into larger and larger sorted sublists until
the entire list is sorted. Merge sort has a time complexity of
O(n log n), making it an efficient algorithm for large data sets.
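
One straightforward way to express this in Python is sketched below; merge_sort is a hypothetical name, and this version returns a new list rather than sorting in place.

def merge_sort(arr):
    # Recursively split the list in half, sort each half, and merge the results.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # Merge two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whatever remains of either half
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]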
5.Quick sort
Quick sort is a highly efficient sorting algorithm that uses
a divide-and-conquer approach to sort a list of elements.
It works by selecting a "pivot" element from the list and
partitioning the other elements into two sub-lists,
according to whether they are less than or greater than
the pivot element. The sub-lists are then recursively
sorted using the same process until the entire list is
sorted. Quick sort has a time complexity of O(n log n) in
the average case and O(n^2) in the worst case, although the
worst case arises only with consistently poor pivot choices
and is rare in practice.
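
A compact Python sketch of this partitioning idea follows. The middle-element pivot and the list-comprehension partition are illustrative choices (the text above does not fix a pivot strategy), and quick_sort is a hypothetical name.

def quick_sort(arr):
    # Pick a pivot, partition around it, and recursively sort each side.
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]                    # one common pivot choice
    less = [x for x in arr if x < pivot]          # elements smaller than the pivot
    equal = [x for x in arr if x == pivot]        # elements equal to the pivot
    greater = [x for x in arr if x > pivot]       # elements larger than the pivot
    return quick_sort(less) + equal + quick_sort(greater)

print(quick_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]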
6.Selection sort
Selection sort is a simple sorting algorithm that works by
repeatedly finding the minimum element from an
unsorted portion of the list and swapping it with the first
unsorted element. The algorithm divides the list into two
portions: a sorted portion that is built up from left to right,
and an unsorted portion that contains the remaining
elements. Selection sort has a time complexity of O(n^2),
making it relatively inefficient for large data sets, but it is
simple to understand and easy to implement.
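
A minimal in-place Python version of this process is sketched below; selection_sort and min_idx are illustrative names.

def selection_sort(arr):
    # Repeatedly select the minimum of the unsorted portion and swap it into place.
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        # Find the smallest element in the unsorted portion arr[i:].
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it with the first unsorted element, extending the sorted portion.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]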
7.Shell sort
Shell sort is a variation of insertion sort that works by sorting
elements that are distant from each other first and then
gradually reducing the distance between elements to be
sorted. It works by dividing the list into sub-lists of elements
that are a fixed increment apart and sorting each sub-list
using insertion sort. The increment is then reduced and the
process is repeated until the increment becomes 1 and the
entire list is sorted. Shell sort has a time complexity that
depends on the gap sequence used to determine the
increment between elements, but it typically performs better
than insertion sort and selection sort for larger data sets.
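
The sketch below uses a simple halving gap sequence, which is only one of many possible choices (as noted above, the complexity depends on the sequence used); shell_sort is a hypothetical name.

def shell_sort(arr):
    # Gapped insertion sort: sort elements a fixed gap apart, then shrink the gap.
    n = len(arr)
    gap = n // 2                         # start with a large increment
    while gap > 0:
        # Insertion sort applied to each gap-separated sub-list.
        for i in range(gap, n):
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = key
        gap //= 2                        # when gap reaches 1, this is ordinary insertion sort
    return arr

print(shell_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]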
8.Time Complexities
Time complexity is a measure of the amount of work (the number of
basic operations) an algorithm performs, expressed as a function of
the input size. In the context of sorting algorithms, time complexity
describes how the running time of an algorithm grows as the size of
the input array increases.
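
To give a rough feel for the growth rates quoted above, the short snippet below prints n log2 n and n^2 for a few arbitrary input sizes; the gap between the two columns is what separates the O(n log n) sorts from the O(n^2) sorts on large inputs.

import math

# Rough comparison of how n log n and n^2 grow with input size n.
for n in (10, 1_000, 1_000_000):
    print(f"n={n:>9,}  n*log2(n)={n * math.log2(n):>16,.0f}  n^2={n * n:>16,}")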
