Zaid bin shafi 19bcs9504
CASE STUDY
DESIGN AND ANALYSIS OF ALGORITHMS
Name: Zaid Bin Shafi    Section: CSE-20
UID: 19BCS9504
Analysis of different sorting techniques
Comparison based sorting –
In comparison based sorting, the elements of an array are compared with each other to produce the
sorted array.
Bubble sort and Insertion sort –
Average and worst case time complexity: n^2
Best case time complexity: n when array is already sorted.
Worst case: when the array is reverse sorted.
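The linear best case comes from an early-exit check: if a full pass makes no swaps, the array is already sorted. A minimal Python sketch (the function name and flag variable are illustrative, not from the source):

```python
def bubble_sort(a):
    """Bubble sort with an early-exit flag: a pass with no swaps
    means the array is sorted, giving O(n) on sorted input."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # best case: sorted input exits after one pass
            break
    return a
```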
Selection sort –
Best, average and worst case time complexity: n^2 which is independent of distribution of
data.
Merge sort –
Best, average and worst case time complexity: nlogn which is independent of distribution
of data.
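Merge sort always splits the array in half and merges the halves, which is why the cost is nlogn regardless of input distribution. A minimal top-down sketch in Python (names are illustrative):

```python
def merge_sort(a):
    """Top-down merge sort: split in half, sort each half recursively,
    then merge. Every case is O(n log n), independent of input order."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```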
Heap sort –
Best, average and worst case time complexity: nlogn which is independent of
distribution of data.
Quick sort –
It is a divide and conquer approach with recurrence relation:
T(n) = T(k) + T(n-k-1) + cn
Worst case: when the array is sorted or reverse sorted, the partition algorithm divides the
array into two subarrays with 0 and n-1 elements. Therefore,
T(n) = T(0) + T(n-1) + cn
Solving this we get, T(n) = O(n^2)
Best case and Average case: On an average, the partition algorithm divides the array in two
subarrays with equal size. Therefore,
T(n) = 2T(n/2) + cn
Solving this we get, T(n) = O(nlogn)
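The two recurrences above correspond directly to where the pivot lands after partitioning. A minimal in-place sketch using the Lomuto partition scheme (one common choice; the source does not specify a scheme, and the names are illustrative):

```python
def quick_sort(a, lo=0, hi=None):
    """In-place quicksort with Lomuto partition (last element as pivot).
    Sorted or reverse-sorted input drives it to the O(n^2) worst case."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        pivot, p = a[hi], lo
        for i in range(lo, hi):
            if a[i] < pivot:
                a[i], a[p] = a[p], a[i]
                p += 1
        a[p], a[hi] = a[hi], a[p]  # pivot lands in its final position
        quick_sort(a, lo, p - 1)   # T(k)
        quick_sort(a, p + 1, hi)   # T(n-k-1)
    return a
```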
Non-comparison based sorting –
In non-comparison based sorting, the elements of the array are not compared with each other to
produce the sorted array.
Radix sort –
Best, average and worst case time complexity: nk where k is the maximum number of
digits in elements of array.
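A common way to realize the nk bound is least-significant-digit radix sort: one stable pass per digit. A minimal Python sketch for non-negative integers (base 10 is assumed here for illustration):

```python
def radix_sort(a):
    """LSD radix sort for non-negative integers: one stable bucketing
    pass per digit, so k passes of O(n) each, i.e. O(nk) overall."""
    a = list(a)
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:
        buckets = [[] for _ in range(10)]
        for x in a:
            buckets[(x // exp) % 10].append(x)  # stable per-digit pass
        a = [x for b in buckets for x in b]
        exp *= 10
    return a
```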
Count sort –
Best, average and worst case time complexity: n+k where k is the size of count array.
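The n+k cost comes from one pass over the input plus one pass over the count array. A minimal stable counting sort sketch in Python (the range bound k is passed in explicitly here):

```python
def counting_sort(a, k):
    """Counting sort for integers in [0, k): O(n + k) time, no element
    comparisons. Stable because the output pass walks the input right
    to left over prefix counts."""
    count = [0] * k
    for x in a:
        count[x] += 1
    for i in range(1, k):          # prefix sums give final positions
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):          # right-to-left keeps equal keys in order
        count[x] -= 1
        out[count[x]] = x
    return out
```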
Bucket sort –
Best and average time complexity: n+k where k is the number of buckets.
Worst case time complexity: n^2, if all elements fall into the same bucket.
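A minimal sketch for values in [0, 1), a common textbook assumption; the per-bucket sort is delegated to Python's built-in sorted here as a stand-in for insertion sort:

```python
def bucket_sort(a, k=10):
    """Bucket sort for values in [0, 1): scatter into k buckets, sort
    each bucket, then concatenate. If every value lands in one bucket,
    the per-bucket sort dominates and time degrades toward O(n^2)
    (assuming insertion sort inside each bucket)."""
    buckets = [[] for _ in range(k)]
    for x in a:
        buckets[int(x * k)].append(x)
    out = []
    for b in buckets:
        out.extend(sorted(b))      # stand-in for per-bucket insertion sort
    return out
```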
In-place/Outplace technique –
A sorting technique is in-place if it does not use any extra memory to sort the array.
Among the comparison based techniques discussed, only merge sort is an outplace technique, as it
requires an extra array to merge the sorted subarrays.
Among the non-comparison based techniques discussed, all are outplace techniques: counting
sort uses a counting array, and bucket sort uses a hash table for sorting the array.
Online/Offline technique –
A sorting technique is considered Online if it can accept new data while the procedure is
ongoing i.e. complete data is not required to start the sorting operation.
Among the comparison based techniques discussed, only insertion sort qualifies, because of the
underlying algorithm it uses: it processes the array (not just individual elements) from left to
right, so if new elements are added on the right, the ongoing operation is not affected.
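The left-to-right processing is visible in a minimal sketch: the loop only ever reads positions up to i, so elements appended past i cannot disturb it (names are illustrative):

```python
def insertion_sort(a):
    """Insertion sort: grow a sorted prefix left to right by inserting
    each new element into place. Elements appended on the right never
    disturb the already-sorted prefix, so it works online."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```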
Stable/Unstable technique –
A sorting technique is stable if it does not change the order of elements with the same value.
Out of comparison based techniques, bubble sort, insertion sort and merge sort are stable
techniques. Selection sort is unstable as it may change the order of elements with the same
value. For example, consider the array 4, 4, 1, 3.
In the first iteration, the minimum element found is 1, and it is swapped with the 4 at the 0th
position. Therefore, the order of that 4 relative to the 4 at the 1st position changes. Similarly,
quick sort and heap sort are also unstable.
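The instability can be made visible by tagging equal keys. A small Python demo (it uses the shorter array 4, 4, 3 rather than the one above, so that the changed order survives into the final result; the tags and names are illustrative):

```python
def selection_sort(pairs):
    """Selection sort over (key, tag) pairs, comparing keys only.
    The long-range swap of the minimum into position i is what
    breaks stability."""
    a = list(pairs)
    for i in range(len(a)):
        m = min(range(i, len(a)), key=lambda j: a[j][0])
        a[i], a[m] = a[m], a[i]
    return a

# Two equal keys tagged by arrival order:
result = selection_sort([(4, 'first'), (4, 'second'), (3, '')])
# The first swap moves (4, 'first') past (4, 'second'), so the
# tagged 4s finish in reversed order: unstable.
```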
Out of non-comparison based techniques, Counting sort and Bucket sort are stable sorting
techniques whereas radix sort stability depends on the underlying algorithm used for sorting.
Analysis of sorting techniques
When the array is almost sorted, insertion sort is preferred.
When the order of the input is not known, merge sort is preferred, as it has a worst case
time complexity of nlogn and is stable as well.
When the array is already sorted, insertion sort and bubble sort give a complexity of n, but
quick sort gives a complexity of n^2.
Time Complexities of all Sorting Algorithms
Algorithm        Best          Average       Worst
Selection Sort   Ω(n^2)        θ(n^2)        O(n^2)
Bubble Sort      Ω(n)          θ(n^2)        O(n^2)
Insertion Sort   Ω(n)          θ(n^2)        O(n^2)
Heap Sort        Ω(n log(n))   θ(n log(n))   O(n log(n))
Quick Sort       Ω(n log(n))   θ(n log(n))   O(n^2)
Merge Sort       Ω(n log(n))   θ(n log(n))   O(n log(n))
Bucket Sort      Ω(n+k)        θ(n+k)        O(n^2)
Radix Sort       Ω(nk)         θ(nk)         O(nk)
Algorithm        Data structure   Best          Average       Worst         Space (worst)
Quick sort       Array            O(n log(n))   O(n log(n))   O(n^2)        O(n)
Merge sort       Array            O(n log(n))   O(n log(n))   O(n log(n))   O(n)
Heap sort        Array            O(n log(n))   O(n log(n))   O(n log(n))   O(1)
Smooth sort      Array            O(n)          O(n log(n))   O(n log(n))   O(1)
Bubble sort      Array            O(n)          O(n^2)        O(n^2)        O(1)
Insertion sort   Array            O(n)          O(n^2)        O(n^2)        O(1)
Selection sort   Array            O(n^2)        O(n^2)        O(n^2)        O(1)
Bogo sort        Array            O(n)          O(n·n!)       O(∞)          O(1)
Consider insertion sort applied to a list of n elements, assumed to be all distinct and initially in
random order. On average, half the elements in a list A1 ... Aj are less than element Aj+1, and
half are greater. Therefore, the algorithm compares the (j+1)th element to be inserted, on
average, with half of the already-sorted sub-list, so tj = j/2. Working out the resulting
average-case running time yields a quadratic function of the input size, just like the worst-case
running time.
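Summing these average comparison counts over all insertions makes the quadratic behavior explicit:

```latex
\sum_{j=1}^{n-1} t_j \approx \sum_{j=1}^{n-1} \frac{j}{2}
  = \frac{1}{2}\cdot\frac{n(n-1)}{2}
  = \frac{n(n-1)}{4}
  = \Theta(n^2)
```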
Quicksort applied to a list of n elements, again assumed to be all distinct and initially in
random order, has an average-case performance of O(n log(n)), which contributes to making
it a very fast algorithm in practice. But given a worst-case input, its performance degrades to
O(n^2). Also, when implemented with the "shortest first" policy (recursing into the shorter
subarray first), the worst-case space complexity is instead bounded by O(log(n)).
Heapsort has O(n) time when all elements are the same. Heapify takes O(n) time and then
removing elements from the heap is O(1) time for each of the n elements. The run time
grows to O(nlog(n)) if all elements must be distinct.
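A compact way to see the nlog(n) bound is to build the heap once and then pop n times. A sketch using Python's heapq module as a stand-in for a hand-written binary heap:

```python
import heapq

def heap_sort(a):
    """Heapsort via a binary min-heap: heapify is O(n), then each of
    the n pops costs O(log n), giving O(n log n) overall for distinct
    elements."""
    heap = list(a)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]
```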
Bogosort has O(n) time when the elements are sorted on the first iteration. In each
iteration, all elements are checked for order. There are n! possible permutations; with a
balanced random number generator, almost every permutation of the array is yielded within n!
iterations. Computers have limited memory, so the generated numbers cycle; it might not be
possible to reach every permutation. In the worst case this leads to O(∞) time, an infinite
loop.
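A minimal bogosort sketch; the generator is seeded here only to keep the demo reproducible, and the names are illustrative:

```python
import random

def bogosort(a, seed=1):
    """Bogosort: reshuffle until the list happens to be sorted.
    Best case is O(n): one linear order check on already-sorted input.
    There is no finite worst-case bound."""
    rng = random.Random(seed)
    a = list(a)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        rng.shuffle(a)
    return a
```

With 3 elements there are only 3! = 6 permutations, so the loop terminates after a handful of shuffles in practice.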