Class Lecture 6 Quick-Sort

KCS 503: Design and Analysis of Algorithms

• Quick Sort
• Sorting in linear time
• Shell sort
B N Pandey 7/19/2020
Quicksort is also a divide-and-conquer algorithm. It takes an unsorted array A in which p and r are the lower and upper bounds of the subarray to be sorted.
• Divide: The array A[p..r] is partitioned (rearranged) into two subarrays A[p..q-1] and A[q+1..r] around a pivot element A[q]. Invariant: every element in A[p..q-1] is ≤ A[q], which is ≤ every element in A[q+1..r].
• Conquer: The two subarrays are sorted by recursive calls to quicksort.
• Combine: Unlike merge sort, no combining step is needed: the two sorted subarrays together already form a sorted array.

Quick sort Algorithm
QUICKSORT(A, p, r)
1 if p < r
2     q ← PARTITION(A, p, r)
3     QUICKSORT(A, p, q - 1)
4     QUICKSORT(A, q + 1, r)
To sort an entire array A, the initial call is
QUICKSORT(A, 1, A.length).

Partitioning the array
PARTITION(A, p, r)
1 x ← A[r]
2 i ← p - 1
3 for j ← p to r - 1
4     do if A[j] ≤ x
5         then i ← i + 1
6              exchange A[i] ↔ A[j]
7 exchange A[i + 1] ↔ A[r]
8 return i + 1
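The two procedures above translate directly into Python; a minimal 0-indexed sketch (the pseudocode is 1-indexed, so the initial call here uses 0 and len(a) - 1):

```python
def partition(a, p, r):
    """Lomuto partition: use a[r] as pivot, return its final index."""
    x = a[r]
    i = p - 1
    for j in range(p, r):
        if a[j] <= x:        # element belongs in the low side
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[r] = a[r], a[i + 1]   # move the pivot into place
    return i + 1

def quicksort(a, p=0, r=None):
    """Sort a[p..r] in place."""
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)
        quicksort(a, p, q - 1)
        quicksort(a, q + 1, r)

data = [5, 2, 9, 1, 7, 3]
quicksort(data)
print(data)  # → [1, 2, 3, 5, 7, 9]
```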

Performance of quicksort

The running time of quicksort depends on whether the partitioning is balanced or unbalanced, which in turn depends on which elements are used for partitioning.

Worst Case Partitioning
The worst-case behavior for quicksort occurs when the partitioning routine produces one subproblem with n - 1 elements and one with 0 elements (for example, when the input is already sorted). The recurrence for the running time is
T(n) = T(n-1) + T(0) + Θ(n)
     = Θ(n²)
i.e., T(n) = Θ(n²)
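Unrolling the recurrence makes the quadratic bound explicit (taking c as a constant bounding the Θ(n) partition cost at each level):

```latex
T(n) = T(n-1) + cn
     = T(n-2) + c(n-1) + cn
     = \cdots
     = c \sum_{i=1}^{n} i
     = c \, \frac{n(n+1)}{2}
     = \Theta(n^2)
```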

Best case partitioning
In the best case, PARTITION splits the array into two equal subarrays. The recurrence for balanced partitioning is
T(n) = 2T(n/2) + Θ(n)
     = Θ(n lg n)
i.e., T(n) = Θ(n lg n)

Sorting in linear time
• The algorithms that can sort n numbers in O(n lg n) time are merge sort and heap sort, which achieve this upper bound in the worst case.
• Quick sort achieves it on average.
• These algorithms share an interesting property: the sorted order they determine is based only on comparisons between the input elements.
• We call such sorting algorithms comparison sorts. All the sorting algorithms introduced thus far are comparison sorts.

• We shall prove that any comparison sort must make Ω(n lg n) comparisons in the worst case to sort n elements. Thus, merge sort and heap sort are asymptotically optimal, and no comparison sort exists that is faster by more than a constant factor.
• Here we examine three sorting algorithms (counting sort, radix sort, and bucket sort) that run in linear time. These algorithms use operations other than comparisons to determine the sorted order. Consequently, the Ω(n lg n) lower bound does not apply to them.
Counting sort
• Counting sort assumes that each of the n input elements is an integer in the range 0 to k, for some integer k. When k = O(n), the sort runs in Θ(n) time.
• The algorithm:
  Input: A[1..n], where A[j] ∈ {0, 1, 2, …, k}
  Output: B[1..n], sorted (notice: not sorting in place)
  Also: array C[0..k] for auxiliary storage

COUNTING-SORT(A, B, k)
1  for i ← 0 to k
2      do C[i] ← 0
3  for j ← 1 to length[A]
4      do C[A[j]] ← C[A[j]] + 1
5  // C[i] now contains the number of elements equal to i.
6  for i ← 1 to k
7      do C[i] ← C[i] + C[i - 1]
8  // C[i] now contains the number of elements less than or equal to i.
9  for j ← length[A] downto 1
10     do B[C[A[j]]] ← A[j]
11        C[A[j]] ← C[A[j]] - 1
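A 0-indexed Python transcription of the pseudocode (the output array is 0-indexed, hence the shift by one when placing each element):

```python
def counting_sort(a, k):
    """Sort a list of integers in the range 0..k; returns a new list."""
    n = len(a)
    c = [0] * (k + 1)
    for x in a:                # count occurrences of each value
        c[x] += 1
    for i in range(1, k + 1):  # prefix sums: c[i] = # of elements <= i
        c[i] += c[i - 1]
    b = [0] * n
    for x in reversed(a):      # scanning right-to-left keeps the sort stable
        b[c[x] - 1] = x
        c[x] -= 1
    return b

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # → [0, 0, 2, 2, 3, 3, 3, 5]
```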

Running time
• The for loop of lines 1-2 takes time Θ(k), the
for loop of lines 3-4 takes time Θ(n), the for
loop of lines 6-7 takes time Θ(k), and the for
loop of lines 9-11 takes time Θ(n). Thus, the
overall time is Θ(k+n). In practice, we usually
use counting sort when we have k = O(n), in
which case the running time is Θ(n).

Radix Sort
• Radix sort solves the card-sorting problem by sorting on the least significant digit first. The algorithm then combines the cards into a single deck, with the cards in the 0 bin preceding the cards in the 1 bin, which precede the cards in the 2 bin, and so on. The process continues until the cards have been sorted on all d digits, so only d passes through the deck are required. The algorithm is
RADIX-SORT(A, d)
1 for i ← 1 to d
2     do use a stable sort to sort array A on digit i
Given n d-digit numbers in which each digit can take
on up to k possible values, RADIXSORT correctly
sorts these numbers in Θ(d(n + k)) time.

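A sketch in Python, using a bucket-based counting pass on each digit as the required stable sort (any stable sort per digit would do):

```python
def radix_sort(a, d, base=10):
    """Sort non-negative integers with at most d digits,
    least significant digit first. Each pass must be stable;
    appending to per-digit buckets in input order guarantees that."""
    for i in range(d):
        div = base ** i
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // div) % base].append(x)
        a = [x for b in buckets for x in b]   # concatenate bins 0..base-1
    return a

print(radix_sort([329, 457, 657, 839, 436, 720, 355], 3))
# → [329, 355, 436, 457, 657, 720, 839]
```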
Bucket sort
• Assumption: the input is n reals drawn from [0, 1).
• Basic idea: create n linked lists (buckets), dividing the interval [0, 1) into subintervals of size 1/n. Add each input element to the appropriate bucket, then sort each bucket with insertion sort. With a uniform input distribution, the expected bucket size is O(1), so the expected total time is O(n). These ideas will return when we study hash tables.

BUCKET-SORT(A)
1 n ← length[A]
2 for i ← 1 to n
3     do insert A[i] into list B[⌊n · A[i]⌋]
4 for i ← 0 to n - 1
5     do sort list B[i] with insertion sort
6 concatenate the lists B[0], B[1], …, B[n - 1] together in order
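A minimal Python sketch of the steps above; it uses Python's built-in per-bucket sort in place of the insertion sort named in the pseudocode (for O(1)-sized buckets the choice does not affect the expected O(n) total):

```python
import math

def bucket_sort(a):
    """Sort n floats in [0, 1) by distributing them into n buckets."""
    n = len(a)
    buckets = [[] for _ in range(n)]
    for x in a:
        buckets[math.floor(n * x)].append(x)  # x in [0,1) lands in bucket 0..n-1
    for b in buckets:
        b.sort()  # stands in for insertion sort on each small bucket
    return [x for b in buckets for x in b]

print(bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]))
# → [0.12, 0.17, 0.21, 0.23, 0.26, 0.39, 0.68, 0.72, 0.78, 0.94]
```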

Shell sort

Shellsort, also known as Shell sort or Shell's method, is an in-place comparison sort. It can be seen as a generalization of either sorting by exchange (bubble sort) or sorting by insertion (insertion sort). The method starts by sorting elements far apart from each other and progressively reduces the gap between them.

# Sort an array a[0...n-1].
gaps = [701, 301, 132, 57, 23, 10, 4, 1]

# Start with the largest gap and work down to a gap of 1.
foreach (gap in gaps)
{
    # Do a gapped insertion sort for this gap size.
    # The first gap elements a[0..gap-1] are already in gapped order;
    # keep adding one more element until the entire array is gap sorted.
    for (i = gap; i < n; i += 1)
    {
        # Save a[i] in temp and make a hole at position i.
        temp = a[i]
        # Shift earlier gap-sorted elements up until the correct
        # location for a[i] is found.
        for (j = i; j >= gap and a[j - gap] > temp; j -= gap)
        {
            a[j] = a[j - gap]
        }
        # Put temp (the original a[i]) in its correct location.
        a[j] = temp
    }
}
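The pseudocode above maps almost line-for-line onto Python; a sketch using the same gap sequence:

```python
def shell_sort(a, gaps=(701, 301, 132, 57, 23, 10, 4, 1)):
    """In-place Shellsort with the gap sequence from the pseudocode."""
    n = len(a)
    for gap in gaps:
        # gapped insertion sort for this gap size
        for i in range(gap, n):
            temp = a[i]           # make a hole at position i
            j = i
            while j >= gap and a[j - gap] > temp:
                a[j] = a[j - gap]  # shift gap-sorted elements up
                j -= gap
            a[j] = temp            # drop temp into its correct location
    return a

print(shell_sort([23, 12, 1, 8, 34, 54, 2, 3]))  # → [1, 2, 3, 8, 12, 23, 34, 54]
```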

The complexity of this algorithm is approximately O(n^1.25); the exact bound depends on the gap sequence used.

The End
