MERGE SORT
Another application of divide and conquer is merge sort.
Given a sequence of n elements a[1], ..., a[n], the general idea is to imagine
them split into two sets a[1], ..., a[⌊n/2⌋] and a[⌊n/2⌋+1], ..., a[n].
Each set is sorted individually, and the resulting sorted sequences are merged
to produce a single sorted sequence of n elements.
Thus we have another ideal example of the divide-and-conquer strategy, in which
the splitting is into two equal-sized sets and the combining operation is the
merging of two sorted sets into one.
Mergesort
Split array A[0..n-1] into two halves of about equal size and make copies of
each half in arrays B and C.
Sort arrays B and C recursively.
Merge the sorted arrays B and C into array A as follows:
i) Repeat the following until no elements remain in one of the arrays:
   - compare the first elements in the remaining unprocessed portions of the
     two arrays;
   - copy the smaller of the two into A, incrementing the index indicating the
     unprocessed portion of that array.
ii) Once all elements in one of the arrays are processed, copy the remaining
    unprocessed elements from the other array into A.
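A minimal C sketch of this merge step, assuming a function name merge_halves
and heap-allocated copies B and C (both are illustrative choices, not part of
the slides):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Merge step of mergesort: A[0..n-1] is assumed to consist of two sorted
 * halves.  Copies of the halves are made in B and C, then merged back into
 * A exactly as in steps i) and ii) above. */
void merge_halves(int A[], int n)
{
    int nb = n / 2, nc = n - nb;
    int *B = malloc(nb * sizeof *B);
    int *C = malloc(nc * sizeof *C);
    memcpy(B, A, nb * sizeof *B);
    memcpy(C, A + nb, nc * sizeof *C);

    int i = 0, j = 0, k = 0;
    while (i < nb && j < nc)              /* step i): both arrays non-empty */
        A[k++] = (B[i] <= C[j]) ? B[i++] : C[j++];
    while (i < nb) A[k++] = B[i++];       /* step ii): copy the leftovers   */
    while (j < nc) A[k++] = C[j++];

    free(B);
    free(C);
}

int main(void)
{
    int A[] = {10, 25, 30, 5, 15, 40};    /* two already-sorted halves */
    merge_halves(A, 6);
    for (int k = 0; k < 6; k++) printf("%d ", A[k]);
    printf("\n");                         /* prints: 5 10 15 25 30 40  */
    return 0;
}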
ALGORITHM FOR MERGE SORT
Algorithm MergeSort(low, high)
// a[low:high] is a global array to be sorted.
// Small(P) is true if there is only one element to sort;
// in that case the list is already sorted.
{
    if (low < high) then  // there is more than one element
    {
        // Divide P into subproblems:
        // find where to split the set.
        mid = ⌊(low + high)/2⌋;
        // Solve the subproblems.
        MergeSort(low, mid);
        MergeSort(mid + 1, high);
        // Combine the solutions.
        Merge(low, mid, high);
    }
}
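The Merge procedure invoked above is not spelled out on the slides; the
following C sketch supplies one possible version under the same low/mid/high
interface. The names merge_sort and merge, the global buffer tmp, and the
sample keys in main are assumptions made for this illustration:

#include <stdio.h>

#define MAXN 100
static int a[MAXN + 1];    /* global array, 1-based as on the slides   */
static int tmp[MAXN + 1];  /* auxiliary buffer used by the merge step  */

/* Merge the sorted runs a[low..mid] and a[mid+1..high] into a[low..high]. */
static void merge(int low, int mid, int high)
{
    int i = low, j = mid + 1, k = low;
    while (i <= mid && j <= high)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid)  tmp[k++] = a[i++];
    while (j <= high) tmp[k++] = a[j++];
    for (k = low; k <= high; k++) a[k] = tmp[k];
}

/* Direct translation of Algorithm MergeSort(low, high). */
static void merge_sort(int low, int high)
{
    if (low < high) {
        int mid = (low + high) / 2;   /* where to split the set     */
        merge_sort(low, mid);         /* solve the two subproblems  */
        merge_sort(mid + 1, high);
        merge(low, mid, high);        /* combine the solutions      */
    }
}

int main(void)
{
    int data[] = {310, 285, 179, 652, 351, 423, 861, 254, 450, 520};
    int n = 10;
    for (int i = 1; i <= n; i++) a[i] = data[i - 1];
    merge_sort(1, n);
    for (int i = 1; i <= n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}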
Tree of Calls of MergeSort (1,10)
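One way to reproduce this call tree is to print one line per recursive call,
indented by recursion depth; the C sketch below does exactly that (the
function name print_calls is an assumption), using the same
mid = ⌊(low+high)/2⌋ split as the algorithm:

#include <stdio.h>

/* Print the tree of calls made by MergeSort(low, high), one call per line,
 * indented by depth.  The root call MergeSort(1,10) splits at mid = 5 into
 * MergeSort(1,5) and MergeSort(6,10), and so on down to single elements. */
static void print_calls(int low, int high, int depth)
{
    printf("%*sMergeSort(%d,%d)\n", 2 * depth, "", low, high);
    if (low < high) {
        int mid = (low + high) / 2;
        print_calls(low, mid, depth + 1);
        print_calls(mid + 1, high, depth + 1);
    }
}

int main(void)
{
    print_calls(1, 10, 0);   /* reproduces the call tree of MergeSort(1,10) */
    return 0;
}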
Time Analysis (MergeSort)
The computing time for merge sort is described by the recurrence relation

    T(n) = a                  if n = 1 (a a constant)
    T(n) = 2T(n/2) + cn       if n > 1 (c a constant)

where n is a power of 2, i.e. n = 2^k. In terms of the general
divide-and-conquer recurrence T(n) = aT(n/b) + f(n), here a = 2, b = 2,
T(1) = a and f(n) = cn.
Now by successive substitutions:

    T(n) = 2T(n/2) + cn
         = 2(2T(n/4) + cn/2) + cn
         = 4T(n/4) + 2cn
         = 4(2T(n/8) + cn/4) + 2cn
         = 8T(n/8) + 3cn
         ...
         = 2^k T(1) + kcn       where n = 2^k
         = an + cn log2 n       where k = log2 n

It is easy to see that if 2^k < n <= 2^(k+1), then T(n) <= T(2^(k+1)).
Therefore, T(n) = O(n log n).
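As a quick numerical check of the closed form, the following C sketch
evaluates the recurrence directly for several powers of 2 and compares it with
a*n + c*n*log2(n); the concrete constants a = 2 and c = 1 are assumptions
chosen only for the check:

#include <stdio.h>
#include <math.h>

/* Evaluate the mergesort recurrence T(1) = a, T(n) = 2T(n/2) + c*n
 * directly, for n a power of 2. */
static double T(long n, double a, double c)
{
    return (n == 1) ? a : 2.0 * T(n / 2, a, c) + c * n;
}

int main(void)
{
    double a = 2.0, c = 1.0;                 /* assumed constants */
    for (long n = 1; n <= (1L << 20); n <<= 4) {
        double closed = a * n + c * n * log2((double)n);
        printf("n=%8ld  recurrence=%12.0f  a*n + c*n*log2(n)=%12.0f\n",
               n, T(n, a, c), closed);
    }
    return 0;
}

For example, with a = 2 and c = 1 the recurrence gives T(16) = 96, which
matches 2*16 + 16*log2(16) = 96.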
QUICK SORT
Select: pick an arbitrary element x in S to be the pivot.
Partition: rearrange the elements so that elements with value less than x go
to a list L to the left of x, and elements with value greater than x go to a
list R to the right of x.
Recursion: recursively sort the lists L and R.
QUICK SORT
In quicksort, the division into two subarrays is made so that the sorted
subarrays do not need to be merged later.
This is accomplished by rearranging the elements in a[1:n] such that
a[i] <= a[j] for all i between 1 and m and all j between m+1 and n, for some
m, 1 <= m <= n.
Thus the elements in a[1:m] and a[m+1:n] can be sorted independently.
No merge is needed. This rearranging is referred to as partitioning.
Algorithm: Partition the array a[m:p-1] about a[m]

Algorithm Partition(a, m, p)
// Within a[m], a[m+1], ..., a[p-1] the elements are rearranged in such a
// manner that if initially t = a[m], then after completion a[q] = t for some
// q between m and p-1, a[k] <= t for m <= k < q, and a[k] >= t for q < k < p.
// q is returned. Set a[p] = +∞ before the call.
{
    v = a[m]; i = m; j = p;
    repeat
    {
        repeat
            i = i + 1;
        until (a[i] >= v);
        repeat
            j = j - 1;
        until (a[j] <= v);
        if (i < j) then Interchange(a, i, j);
    } until (i >= j);
    a[m] = a[j]; a[j] = v;
    return j;
}

Algorithm Interchange(a, i, j)
// Exchange a[i] with a[j].
{
    p = a[i];
    a[i] = a[j];
    a[j] = p;
}
Algorithm: Sorting by Partitioning

Algorithm QuickSort(p, q)
// Sorts the elements a[p], ..., a[q], which reside in the global array
// a[1:n], into ascending order; a[n+1] is considered to be defined
// and must be >= all the elements in a[1:n].
{
    if (p < q) then  // there is more than one element
    {
        // Divide P into two subproblems.
        j = Partition(a, p, q + 1);
        // j is the position of the partitioning element.
        // Solve the subproblems.
        QuickSort(p, j - 1);
        QuickSort(j + 1, q);
        // There is no need for combining the solutions.
    }
}
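A compilable C sketch of the Partition/QuickSort pair is given below. The
1-based global array, the use of INT_MAX as the a[n+1] = +∞ sentinel, and the
sample keys in main are assumptions for this illustration; the sample input is
chosen so that the first call to partition reproduces the split
[60 45 50 55] 65 [85 80 75 70] shown on the example slide:

#include <stdio.h>
#include <limits.h>

#define MAXN 100
static int a[MAXN + 2];   /* 1-based global array; a[n+1] holds the sentinel */

static void interchange(int i, int j)
{
    int t = a[i]; a[i] = a[j]; a[j] = t;
}

/* Partition a[m..p-1] about the pivot v = a[m]; a[p] must be >= v
 * (for the outermost call this is the +infinity sentinel in a[n+1]). */
static int partition(int m, int p)
{
    int v = a[m], i = m, j = p;
    do {
        do { i++; } while (a[i] < v);      /* scan right for an element >= v */
        do { j--; } while (a[j] > v);      /* scan left  for an element <= v */
        if (i < j) interchange(i, j);
    } while (i < j);
    a[m] = a[j]; a[j] = v;                 /* place the pivot at position j  */
    return j;
}

static void quicksort(int p, int q)
{
    if (p < q) {
        int j = partition(p, q + 1);       /* j = final position of pivot    */
        quicksort(p, j - 1);
        quicksort(j + 1, q);               /* no combine step is needed      */
    }
}

int main(void)
{
    int data[] = {65, 70, 75, 80, 85, 60, 55, 50, 45};
    int n = 9;
    for (int i = 1; i <= n; i++) a[i] = data[i - 1];
    a[n + 1] = INT_MAX;                    /* the a[n+1] = +infinity sentinel */
    quicksort(1, n);
    for (int i = 1; i <= n; i++) printf("%d ", a[i]);
    printf("\n");
    return 0;
}

Because the inner scans stop only on a key comparison, the sentinel (and, in
the recursive calls, the previously placed pivots) is what keeps the i pointer
inside the array without any explicit bounds check.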
Quick Sort Example
(The array is partitioned into [60 45 50 55] 65 [85 80 75 70], with 65 as the
partitioning element.)

Another example:
[15 22 13 27 12 10 20 25]
• Why +∞ at a[n+1]?
• Explain using the example below:

  index:   1    2    3    4    5    6    7    8    9    10
  a[ ]:  [ 65   12   15   11   16   20   23   17   21   +∞ ]

  Here the pivot v = a[1] = 65 is larger than every other key, so the i
  pointer keeps advancing; the +∞ sentinel in a[n+1] is what finally stops the
  scan and keeps i from running past the end of the array.
Best Case Analysis of Quicksort
Best case: all splits happen in the middle: O(n log n)

    Tbest(n) = 2 Tbest(n/2) + n  for n > 1,   Tbest(1) = 0

    T(n) = 2T(n/2) + n
         = 2[2T(n/4) + n/2] + n
         = 4T(n/4) + 2n
         = 4[2T(n/8) + n/4] + 2n
         = 8T(n/8) + 3n
         ... after k substitutions ...
         = 2^k T(n/2^k) + kn
         = n T(1) + kn          where n = 2^k
         = n log2 n             where k = log2 n

    T(n) = O(n log n)
Worst Case Analysis of Quicksort
Worst case: an already sorted array: O(n^2)
Each partitioning step splits off only one element, so the subproblem sizes
decrease by just one:

    Cworst(n) = (n+1) + n + ... + 2
              = (n+1)(n+2)/2 - 1

    Cworst(n) = O(n^2)
Average Case Analysis of Quicksort
Average case: random arrays: O(n log n)
• Cavg(n) is much smaller than Cworst(n).
• Under the assumption that the partitioning element v has an equal
  probability of being the ith smallest element (for each i),
  Cavg(n) = O(n log n).
Comparison between Mergesort and QuickSort
Experiments:
• Comparison between QuickSort and MergeSort on a Sun 10/30
• Data generated by a random integer generator
• Average case taken over 50 random inputs

Fig: Average computing time for the two sorting algorithms
Thank You