Lecture
Sorting Algorithms
Sorting Techniques
Introduction
A Sorting Algorithm is used to rearrange a given array or list of elements
according to a comparison operator on the elements.
The comparison operator is used to decide the new order of elements in
the respective data structure.
Sorting in data structures is the process of rearranging the elements of an array or list so that they are easier to search and retrieve.
Sorting reduces the complexity of searching for a particular item.
Introduction (Cont’d)
For instance, finding a particular item in an unsorted list may require examining every element, which takes too long for a large list.
However, searching for an item is much faster if the same list is sorted.
Sorting Techniques
Several sorting techniques in data structure can be used to sort data elements in
an array or list.
The most common sorting techniques in data structures are:
• insertion sort
• selection sort
• bubble sort
• quick sort
• merge sort.
Sorting - Bubble Sort
• Bubble Sort is the simplest sorting algorithm; it works by repeatedly
swapping adjacent elements if they are in the wrong order.
• In the Bubble Sort algorithm,
1. traverse the list from the left and compare adjacent elements; the larger one is placed on the right side.
2. In this way, the largest element is moved to the rightmost end first.
3. This process is then continued to find the second largest element, then the third largest, and so on, until the whole list is sorted.
Sorting- Bubble Sort
• It follows that the smaller elements move toward the top (beginning), and the
larger elements move toward the bottom (end) of the list.
• In the first iteration, we consider list[0]...list[n-1]; in the second iteration, we
consider list[0]...list[n - 2]; in the third iteration, we consider list[0]...list[n-3], and
so on. For example, for a list of five elements we consider list[0]...list[4].
Sorting- Bubble Sort
Step 1: Compare A[1] with A[2], then A[2] with A[3], and so on, swapping each pair that is out of
order; we stop after we compare and possibly rearrange A[n-1] and A[n]. When Step 1 is completed,
the largest element will occupy A[n].
Step 2: Repeat Step 1 with one less comparison; that is, now we stop after we
compare and possibly rearrange A[n-2] and A[n-1]. When Step 2 is completed,
the second largest element will occupy A[n-1].
Step 3: Repeat Step 1 with two fewer comparisons; that is, we stop after we
compare and possibly rearrange A[n-3] and A[n-2].
...
Step n-1: Compare A[1] with A[2] and arrange them so that A[1] < A[2].
• After n-1 steps, the list will be sorted in increasing order (a code sketch of this procedure is given below).
Total no. of passes: n-1
Total no. of comparisons: n*(n-1)/2
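A minimal C++ sketch of the procedure described above (the function name bubbleSort and the sample values in main are illustrative, not taken from the slides):

#include <iostream>
#include <utility>
#include <vector>
using namespace std;

// Repeatedly compare adjacent elements and swap them if they are out of order.
// After each pass, one more of the largest elements is in its final position at the end.
void bubbleSort(vector<int>& list) {
    int n = list.size();
    for (int pass = 0; pass < n - 1; pass++) {        // n-1 passes in total
        for (int j = 0; j < n - 1 - pass; j++) {      // one less comparison in each pass
            if (list[j] > list[j + 1]) {
                swap(list[j], list[j + 1]);           // larger element moves to the right
            }
        }
    }
}

int main() {
    vector<int> list = {5, 1, 4, 2, 8};               // sample data (assumed)
    bubbleSort(list);
    for (int x : list) cout << x << " ";              // prints: 1 2 4 5 8
    return 0;
}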
Sorting- Bubble Sort
Time Complexity: O(N^2)
Advantages of Bubble Sort:
• Bubble sort is easy to understand and implement.
• It does not require any additional memory space.
• It is a stable sorting algorithm, meaning that elements with the same key value maintain their
relative order in the sorted output.
Disadvantages of Bubble Sort:
• Bubble sort has a time complexity of O(N^2), which makes it very slow for large data sets.
• Bubble sort is a comparison-based sorting algorithm, which means that it requires a comparison
operator to determine the relative order of elements in the input data set.
• This requirement can limit the efficiency of the algorithm in certain cases.
Sorting- Selection Sort
Selection sort is a simple and efficient sorting algorithm that works by
repeatedly selecting the smallest (or largest) element from the unsorted
portion of the list and moving it to the sorted portion of the list.
The algorithm repeatedly selects the smallest (or largest) element from the
unsorted portion of the list and swaps it with the first element of the unsorted part.
This process is repeated for the remaining unsorted portion until the entire list is
sorted.
Sorting- Selection Sort
Let's consider the following array as an example: arr[] = {64, 25, 12, 22, 11}
First pass:
• For the first position in the sorted array, the whole array is traversed from index 0 to 4
sequentially. The first position currently holds 64; after traversing the whole array, it is
clear that 11 is the lowest value.
• Thus, swap 64 with 11. After one iteration, 11, which happens to be the least value in the
array, appears in the first position of the sorted list.
Sorting- Selection Sort
Second Pass:
For the second position, where 25 is present, again traverse the rest of the array in a sequential
manner.
After traversing, we find that 12 is the second lowest value in the array and should appear in
the second place, so we swap 25 and 12.
Sorting- Selection Sort
Third Pass:
Now, for the third place, where 25 is present, again traverse the rest of the array and find the third
least value in the array.
While traversing, 22 turns out to be the third least value, and it should appear in the third place, so
swap 22 with the element at the third position.
Sorting- Selection Sort
Fourth pass:
Similarly, for the fourth position, traverse the rest of the array and find the fourth least element in the
array.
As 25 is already the fourth lowest value, it stays at the fourth position (no swap is needed).
Sorting- Selection Sort
Fifth Pass:
At last, the largest value present in the array is automatically placed at the last position in the
array.
The resulting array is the sorted array.
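Summarizing the passes described above for arr[] = {64, 25, 12, 22, 11}:

Initial array:  64 25 12 22 11
After pass 1:   11 25 12 22 64   (11 swapped with 64)
After pass 2:   11 12 25 22 64   (12 swapped with 25)
After pass 3:   11 12 22 25 64   (22 swapped with 25)
After pass 4:   11 12 22 25 64   (25 already in place; 64 is automatically last)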
Sorting- Selection Sort
Algorithm: Selection-Sort (A)
for i ← 1 to n-1 do
    min_j ← i
    min_x ← A[i]
    for j ← i+1 to n do
        if A[j] < min_x then
            min_j ← j
            min_x ← A[j]
    A[min_j] ← A[i]
    A[i] ← min_x
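The same algorithm written as a C++ sketch (the function name selectionSort is illustrative); it mirrors the pseudocode above, using 0-based indexing:

#include <iostream>
#include <utility>
#include <vector>
using namespace std;

// Repeatedly select the smallest element of the unsorted part
// and swap it with the first element of that part.
void selectionSort(vector<int>& A) {
    int n = A.size();
    for (int i = 0; i < n - 1; i++) {
        int minIndex = i;                             // index of the smallest element seen so far
        for (int j = i + 1; j < n; j++) {
            if (A[j] < A[minIndex]) {
                minIndex = j;
            }
        }
        swap(A[i], A[minIndex]);                      // move it to the front of the unsorted part
    }
}

int main() {
    vector<int> arr = {64, 25, 12, 22, 11};           // example array from the slides
    selectionSort(arr);
    for (int x : arr) cout << x << " ";               // prints: 11 12 22 25 64
    return 0;
}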
Sorting- Selection Sort
The time complexity of Selection Sort is O(N^2), as there are two nested loops:
• One loop to select an element of the array one by one = O(N)
• Another loop to compare that element with every other array element = O(N)
• Therefore, overall complexity = O(N) * O(N) = O(N*N) = O(N^2)
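As a quick check, for N = 5 (the example array above) the inner loop performs 4 + 3 + 2 + 1 = 10 comparisons, which matches N*(N-1)/2 = 5*4/2 = 10.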
Sorting- Selection Sort
Advantages of Selection Sort Algorithm
Simple and easy to understand.
Works well with small datasets.
Disadvantages of the Selection Sort Algorithm
Selection sort has a time complexity of O(n^2) in the worst and average case.
Does not work well on large datasets.
Does not preserve the relative order of items with equal keys, which means it is not
stable.
Sorting- Insertion Sort
• Insertion sort is a simple sorting algorithm that works similarly to
the way you sort playing cards in your hands.
• The array is virtually split into a sorted and an unsorted part.
• Values from the unsorted part are picked and placed in the correct
position in the sorted part.
How Does it Work
• To sort an array of size N in ascending order, iterate over the array.
• Compare the current element (key) to its predecessor.
• If the key element is smaller than its predecessor, compare it to the
elements before it.
• Move the greater elements one position up to make space for the key
element (a code sketch is given below).
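A minimal C++ sketch of these steps (the function name insertionSort is illustrative; the sample array is the one used on the following slides):

#include <iostream>
#include <vector>
using namespace std;

// Grow a sorted prefix one element at a time: take the next key and
// shift the greater elements of the sorted part one position up.
void insertionSort(vector<int>& arr) {
    int n = arr.size();
    for (int i = 1; i < n; i++) {
        int key = arr[i];                             // current element to insert
        int j = i - 1;
        while (j >= 0 && arr[j] > key) {              // move greater elements one position up
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;                             // place the key at its correct position
    }
}

int main() {
    vector<int> arr = {12, 11, 13, 5, 6};             // example array from the slides
    insertionSort(arr);
    for (int x : arr) cout << x << " ";               // prints: 5 6 11 12 13
    return 0;
}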
Sorting- Insertion Sort
Consider an example: arr[] = {12, 11, 13, 5, 6}
First Pass:
12 11 13 5 6
Initially, the first two elements of the array are compared in insertion sort.
12 11 13 5 6
Here, 12 is greater than 11; hence they are not in ascending order and 12 is not at its correct position.
Thus, swap 11 and 12.
So, for now, 11 is stored in a sorted sub-array.
11 12 13 5 6
Sorting- Insertion Sort
Second Pass:
Now, move to the next two elements and compare them
11 12 13 5 6
Here, 13 is greater than 12, so both elements are in ascending order; hence, no swapping will
occur. 12 is also stored in the sorted sub-array along with 11.
Sorting- Insertion Sort
Third Pass:
Now, two elements are present in the sorted sub-array, which are 11 and 12.
Moving forward to the next two elements, which are 13 and 5:
11 12 13 5 6
5 and 13 are not at their correct places, so swap them.
11 12 5 13 6
After swapping, elements 12 and 5 are not sorted, thus swap again
11 5 12 13 6
Here, again 11 and 5 are not sorted, hence swap again
5 11 12 13 6
Here, 5 is at its correct position
Sorting- Insertion Sort
Fourth Pass:
Now, the elements which are present in the sorted sub-array are 5, 11 and 12
Moving to the next two elements 13 and 6
5 11 12 13 6
Clearly, they are not in ascending order, so swap them.
5 11 12 6 13
Now, 6 is smaller than 12, hence, swap again
5 11 6 12 13
Here, the swap leaves 11 and 6 out of order; hence, swap again.
5 6 11 12 13
Finally, the array is completely sorted.
Sorting- Insertion Sort
Time Complexity of Insertion Sort
• The worst-case time complexity of the Insertion sort is O(N^2)
• The average case time complexity of the Insertion sort is O(N^2)
• The time complexity of the best case is O(N), which occurs when the array is already sorted
(each key is compared only with its immediate predecessor).
Sorting- Insertion Sort
This algorithm is one of the simplest algorithms, with a simple implementation.
Insertion sort is efficient for small data sets. It is also adaptive
in nature, i.e., it is appropriate for data sets that are already partially sorted.
Sorting- Quick Sort
QuickSort is a sorting algorithm based on the Divide and Conquer paradigm.
It picks an element as a pivot and partitions the given array around the picked
pivot by placing the pivot in its correct position in the sorted array.
This reduces the size of each sub-set and makes it easier to find the next pivot
element.
The process is repeated until all elements are sorted.
Sorting- Quick Sort
The key process in quickSort is the partition() function.
The goal of partitioning is to place the pivot (any element can be chosen to
be the pivot) at its correct position in the sorted array, and to put all smaller
elements to the left of the pivot and all greater elements to the right of the
pivot.
Partitioning is then done recursively on each side of the pivot after the pivot is
placed in its correct position, and this finally sorts the array.
Sorting- Quick Sort
The logic is simple: we start from the leftmost element and keep
track of the index of the smaller (or equal) elements as i.
While traversing, if we find an element smaller than the pivot, we
increment i and swap the current element with arr[i].
Otherwise, we ignore the current element (a code sketch is given after the pivot choices below).
Sorting- Quick Sort
There are many different choices for picking pivots:
• Always pick the first element as a pivot
• Always pick the last element as a pivot (used in the sketch below)
• Pick a random element as a pivot
• Pick the middle as the pivot.
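A C++ sketch of partition() and the recursive quickSort(), using the Lomuto partition scheme with the last element as the pivot, as noted in the list above (the function names are illustrative):

#include <iostream>
#include <utility>
#include <vector>
using namespace std;

// Place the pivot (the last element) at its correct position, with all
// smaller elements to its left and all greater elements to its right.
// Returns the final index of the pivot.
int partition(vector<int>& arr, int low, int high) {
    int pivot = arr[high];                            // last element chosen as pivot
    int i = low - 1;                                  // boundary of the "smaller than pivot" region
    for (int j = low; j < high; j++) {
        if (arr[j] < pivot) {                         // smaller element found
            i++;
            swap(arr[i], arr[j]);                     // move it into the smaller region
        }                                             // otherwise, ignore the current element
    }
    swap(arr[i + 1], arr[high]);                      // put the pivot in its final position
    return i + 1;
}

// Recursively sort the parts to the left and to the right of the pivot.
void quickSort(vector<int>& arr, int low, int high) {
    if (low < high) {
        int p = partition(arr, low, high);
        quickSort(arr, low, p - 1);
        quickSort(arr, p + 1, high);
    }
}

int main() {
    vector<int> arr = {10, 80, 30, 90, 40};           // example array from the slides
    int n = arr.size();
    quickSort(arr, 0, n - 1);
    for (int x : arr) cout << x << " ";               // prints: 10 30 40 80 90
    return 0;
}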
Sorting- Quick Sort
Consider: arr[] = {10, 80, 30, 90, 40}, with the last element, 40, chosen as the pivot.
Compare 10 with the pivot; as it is less than the pivot, arrange it accordingly.
Sorting- Quick Sort
Compare 80 with the pivot. It is greater than the pivot.
Sorting- Quick Sort
Compare 30 with the pivot. It is less than the pivot, so arrange it accordingly.
Sorting- Quick Sort
Compare 90 with the pivot. It is greater than the pivot.
Sorting- Quick Sort
Arrange the pivot in its correct position.
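Assuming the last element 40 is the pivot (consistent with the comparisons above) and the partition scheme sketched earlier, the array at this point is {10, 30, 40, 90, 80}, with the pivot 40 at its final position. QuickSort then recurses on the left part {10, 30} and the right part {90, 80}.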
Sorting- Quick Sort
Advantages of Quick Sort:
• It is a divide-and-conquer algorithm that makes it easier to solve problems.
• It is efficient on large data sets.
• It has a low overhead, as it only requires a small amount of memory to function.
Disadvantages of Quick Sort:
• It has a worst-case time complexity of O(N^2), which occurs when the pivot is chosen poorly.
• It is not a good choice for small data sets.
• It is not a stable sort, meaning that if two elements have the same key, their relative order will
not be preserved in the sorted output.