In-place Sorting and Not-in-place Sorting
Sorting algorithms may require some extra space for comparisons and for temporary storage of a few data elements.
Algorithms that need no extra space beyond this constant amount are said to sort in-place, that is, within
the array itself. This is called in-place sorting. Bubble sort is an example of in-place sorting.
However, some sorting algorithms require additional space that is equal to or greater than the number of elements
being sorted. Sorting that uses this much extra space is called not-in-place sorting. Merge sort is an example
of not-in-place sorting.
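As a rough illustration, here is a minimal Python sketch of the two cases (the sample values are placeholders chosen only for demonstration): bubble sort swaps elements within the input list itself, while this merge sort allocates new lists proportional to the input size.

```python
def bubble_sort(arr):
    # In-place: elements are swapped within the input list itself,
    # so only O(1) extra space is used.
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr


def merge_sort(arr):
    # Not-in-place: the recursive calls and the merge step build new
    # lists, so O(n) auxiliary space is required.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8], sorted in the same list
print(merge_sort([5, 1, 4, 2, 8]))   # [1, 2, 4, 5, 8], returned as a new list
```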
Stable and Not Stable Sorting
If a sorting algorithm, after sorting the contents, does not change the relative order of equal elements (elements
with the same key), it is called stable sorting.
If a sorting algorithm, after sorting the contents, changes the relative order of equal elements,
it is called unstable sorting.
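For example, Python's built-in sorted is a stable sort, so a minimal sketch (the names and scores are placeholder data) shows records with equal keys keeping their original order:

```python
# Sort records by score; because the sort is stable, the two records
# with score 90 keep their original relative order (Ann before Cara).
records = [("Ann", 90), ("Bob", 75), ("Cara", 90), ("Dan", 60)]
by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# [('Dan', 60), ('Bob', 75), ('Ann', 90), ('Cara', 90)]
```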
Adaptive and Non-Adaptive Sorting Algorithm
A sorting algorithm is said to be adaptive if it takes advantage of elements that are already sorted in the list that is to
be sorted. That is, while sorting, if the source list has some elements already in order, an adaptive algorithm will take
this into account and will try not to re-order them.
A non-adaptive algorithm is one which does not take into account the elements which are already sorted. It
forces every single element to be re-examined to confirm its sortedness.
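A minimal sketch of adaptive behaviour, assuming a small list of integers: insertion sort is adaptive because on an already or nearly sorted input its inner loop exits almost immediately, giving close to O(n) work, whereas a non-adaptive algorithm such as selection sort scans the whole remaining list on every pass regardless of existing order.

```python
def insertion_sort(arr):
    # Adaptive: when the input is already (or nearly) sorted, the inner
    # while loop stops after very few shifts, so total work is close to O(n).
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr


# Only the last element is out of place, so very little work is done.
print(insertion_sort([1, 2, 3, 5, 4]))  # [1, 2, 3, 4, 5]
```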
Sorting Techniques in Data Structures
Introduction:
Sorting is a fundamental computer science operation that entails putting a group of objects in a specific order. It
is extensively utilised in many different applications, including database management, data analysis, and
searching. In data structures, different sorting techniques are employed to organize and manipulate large sets of
data efficiently.
Sorting Techniques:
There are various sorting techniques; their time and space complexities are summarized in the table at the end of this section.
Applications, Advantages and Disadvantages of Sorting Algorithm
Sorting algorithms are used to arrange a list of elements in a specific order, such
as ascending, descending, or any other user-specified order, for example sorting strings by length.
Applications of Sorting Algorithms:
Quickly finding the k-th smallest or k-th largest element: Once we sort the array, we can find the k-th smallest and
k-th largest elements in O(1) time for different values of k (see the sketch after this list).
Searching algorithms: Sorting is often a crucial step in search algorithms like binary search and ternary
search, where the data needs to be sorted before searching for a specific element.
Data management: Sorting data makes it easier to search, retrieve, and analyze.
Database optimization: Sorting data in databases improves query performance. We typically keep the
data sorted by primary index so that we can do quick queries.
Machine learning: Sorting is used to prepare data for training machine learning models.
Data Analysis: Sorting helps in identifying patterns, trends, and outliers in datasets. It plays a vital role in
statistical analysis, financial modeling, and other data-driven fields.
Operating Systems: Sorting algorithms are used in operating systems for tasks like task scheduling,
memory management, and file system organization.
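As a rough illustration of the first two applications above, a minimal Python sketch (the data values are placeholders): after sorting once, the k-th smallest element is a direct index lookup, and binary search becomes possible via the standard bisect module.

```python
import bisect

data = [42, 7, 19, 3, 25, 11]
data.sort()                      # sort once: O(n log n)

# k-th smallest is now a direct index lookup, O(1) per query.
k = 3
print("k-th smallest (k=3):", data[k - 1])   # 11

# Binary search only works on sorted data: O(log n) per lookup.
target = 25
i = bisect.bisect_left(data, target)
print("found" if i < len(data) and data[i] == target else "not found")
```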
Advantages of Sorting Algorithms:
Efficiency: Sorting algorithms help in arranging data in a specific order, making it easier and faster to
search, retrieve, and analyze information.
Improved Performance: By organizing data in a sorted manner, algorithms can perform operations more
efficiently, leading to improved performance in various applications.
Simplified data analysis: Sorting makes it easier to identify patterns and trends in data.
Reduced memory consumption: Sorting places duplicate elements next to each other, making them easy to detect
and eliminate, which can reduce memory usage.
Improved data visualization: Sorted data can be visualized more effectively in charts and graphs.
Disadvantages of Sorting Algorithms:
Costly insertion: If we wish to keep data sorted, the insertion operation becomes costly because the sorted
order must be maintained. If we do not have to maintain sorted order, we can simply insert at the end.
Algorithm selection: Choosing the most appropriate sorting algorithm for a given dataset can be
challenging.
Not always the best tool: For a lot of problems hashing works better than sorting, for example finding distinct elements or finding a
pair with a given sum (see the sketch after this list).
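As a rough illustration of that last point, a minimal Python sketch of the hashing approach to the pair-with-given-sum problem (the input values are placeholders): a single pass with a set runs in O(n) average time, versus sorting first in O(n log n) and then scanning with two pointers.

```python
def has_pair_with_sum(nums, target):
    # Hashing approach: remember every value seen so far in a set and
    # check whether its complement (target - x) has already appeared.
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False


print(has_pair_with_sum([8, 3, 6, 1], 9))   # True  (3 + 6)
print(has_pair_with_sum([8, 3, 6, 1], 20))  # False
```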
Time Complexities of all Sorting Algorithms
The efficiency of an algorithm depends on two parameters:
1. Time Complexity
2. Space Complexity
Time Complexity:
Time complexity is defined as the number of times a particular set of instructions is executed rather than the
total time taken, because the total time also depends on external factors such as the compiler
used, the processor's speed, and so on.
Space Complexity:
Space Complexity is the total memory space required by the program for its execution.
Both are calculated as a function of the input size (n). One important thing to note is that, besides these parameters,
the efficiency of an algorithm also depends upon the nature and size of the input.
Types of Time Complexity:
1. Best-case time complexity: The input for which the algorithm takes the minimum time. The best case
gives a lower bound on the running time of the algorithm. Example: in linear search, the best case occurs when the
search key is at the first location of the data.
2. Average-case time complexity: In the average case, we take all possible (random) inputs, calculate the computation
time for each input, and then divide the total by the number of inputs.
3. Worst-case time complexity: The input for which the algorithm takes the maximum time. The worst case
gives an upper bound on the running time of the algorithm. Example: in linear search, the worst case occurs when the
search key is at the last location of the data or is not present at all (illustrated in the sketch after this list).
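As a rough illustration of the best and worst cases described above, a minimal Python sketch using linear search (the array contents are placeholders):

```python
def linear_search(arr, target):
    # Number of comparisons depends on where the target sits:
    # best case  -> first position, 1 comparison
    # worst case -> last position or absent, n comparisons
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1


arr = list(range(1, 1001))
print(linear_search(arr, 1))     # best case: found at index 0
print(linear_search(arr, 1000))  # worst case: found at index 999
```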
Algorithm        Time Complexity                               Space Complexity
                 Best          Average       Worst             Worst
Selection Sort   O(n²)         O(n²)         O(n²)             O(1)
Bubble Sort      O(n)          O(n²)         O(n²)             O(1)
Insertion Sort   O(n)          O(n²)         O(n²)             O(1)
Heap Sort        O(n log n)    O(n log n)    O(n log n)        O(1)
Quick Sort       O(n log n)    O(n log n)    O(n²)             O(n)
Merge Sort       O(n log n)    O(n log n)    O(n log n)        O(n)
Bucket Sort      O(n + k)      O(n + k)      O(n²)             O(n)
Radix Sort       O(nk)         O(nk)         O(nk)             O(n + k)
Count Sort       O(n + k)      O(n + k)      O(n + k)          O(k)
Shell Sort       O(n log n)    O(n log n)    O(n²)             O(1)
Tim Sort         O(n)          O(n log n)    O(n log n)        O(n)
Tree Sort        O(n log n)    O(n log n)    O(n²)             O(n)
Cube Sort        O(n)          O(n log n)    O(n log n)        O(n)