
Hindawi

Journal of Computer Networks and Communications


Volume 2019, Article ID 3027578, 7 pages
https://doi.org/10.1155/2019/3027578

Research Article
SA Sorting: A Novel Sorting Technique for Large-Scale Data

Mohammad Shabaz¹ and Ashok Kumar²

¹Apex Institute of Technology, Chandigarh University, S.A.S. Nagar, India
²Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, India

Correspondence should be addressed to Mohammad Shabaz; [email protected]

Received 19 August 2018; Accepted 6 December 2018; Published 23 January 2019

Academic Editor: Youyun Xu

Copyright © 2019 Mohammad Shabaz and Ashok Kumar. This is an open access article distributed under the Creative Commons
Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is
properly cited.
Sorting is a fundamental operation on data structures, defined as the arrangement of data or records in a particular logical order. A number of algorithms have been developed for sorting data; the reason behind developing these algorithms is to optimize efficiency and complexity, and work on new sorting approaches is still going on. With the rise of big data, the concept of big numbers comes into existence. To sort thousands of records, either sorted or unsorted, traditional sorting approaches can be used, and their complexities can be ignored because only a very minute difference exists in their execution times. But when the data are very large, where the execution time for billions or trillions of records is very long, the complexity cannot be ignored, and an optimized sorting approach is required. SA sorting is one of the approaches developed to handle sorted big numbers, as it works better on sorted numbers than quick sort and many others. It can also be used to sort unsorted records.

1. Introduction

Sorting big numbers in an optimized way is a challenging task to perform. A number of such sorting algorithms exist which are optimized, but their execution time is still to be optimized. Sometimes, these algorithms take the same amount of time to sort a sorted record as they take to sort an unsorted record. A sorting algorithm should be stable, effective, less complex, and efficient; SA sorting follows most of these parameters. SA sorting is introduced as a new approach that operates on both sorted and unsorted lists or records and shows better execution time on the sorted list. The following section discusses the existing sorting approaches.

2. Sorting Techniques

2.1. Bubble Sort. It is a stable sorting algorithm in which each element in the list is compared with its next adjacent element, and the process is repeated until the elements are sorted. If we have n elements, then there are (n − 1) passes and n(n − 1)/2 iterations in totality. Mathematically,

F(n) = n(n − 1)/2 = (n² − n)/2, (1)

and thus,

F(n) ≤ n². (2)

The algorithm for bubble sort is given in Algorithm 1. In this algorithm, if no swap occurs, it breaks the loop and goes directly to the end; thus, only one loop executes, which determines the best-case analysis of O(n). On the contrary, the complexity of the average and worst cases is O(n²). Bubble sort is highly code inefficient and is one of the worst sorting approaches, so professionals do not use it.

(1) for x = 1 to n do
(2)   set Nswap ← 0
(3)   for y = n to x + 1 do
(4)     if (a[y] < a[y − 1]) then
(5)       T ← a[y]
(6)       a[y] ← a[y − 1]
(7)       a[y − 1] ← T
(8)       Nswap ← 1
(9)     end if
(10)  end for
(11)  if Nswap = 0 then
(12)    break
(13)  end if
(14) end for

ALGORITHM 1: Bubble sort.
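To make the procedure concrete, a minimal C++ sketch of Algorithm 1 follows (our own transcription, not code from the original paper); the Nswap flag is what yields the O(n) best case on an already sorted list:

    #include <cstddef>
    #include <utility>
    #include <vector>

    // Bubble sort with an early-exit flag (Nswap in Algorithm 1): if a full
    // pass performs no swap, the list is already sorted and the loop stops.
    void bubble_sort(std::vector<int>& a) {
        const std::size_t n = a.size();
        for (std::size_t x = 1; x < n; ++x) {
            bool swapped = false;
            for (std::size_t y = n - 1; y >= x; --y) {
                if (a[y] < a[y - 1]) {
                    std::swap(a[y], a[y - 1]);  // bubble the smaller value toward the head
                    swapped = true;
                }
            }
            if (!swapped) break;  // best case: a sorted list costs one O(n) pass
        }
    }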
2.2. Insertion Sort. It is a stable sorting algorithm in which, starting from the first number of the record, each number is compared with its adjacent numbers to find its sorted position; when the position is found, the number is inserted there. The algorithm for insertion sort is given in Algorithm 2.

(1) for e = 2 to length(X) do
(2)   set k ← X[e] ▷ insert X[e] into the sorted list X[1..e − 1]
(3)   L ← e − 1
(4)   while L > 0 and X[L] > k do
(5)     X[L + 1] ← X[L]
(6)     L ← L − 1
(7)   end while
(8)   X[L + 1] ← k
(9) end for

ALGORITHM 2: Insertion sort.

Insertion sort holds good for smaller datasets, since it is also code inefficient for large data/lists or big numbers. Insertion sort is about 2 times faster than bubble sort. For the best case, it is O(n), while for the average and worst cases, it is O(n²).
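A matching C++ sketch of Algorithm 2 (again our own transcription, using 0-based indices):

    #include <cstddef>
    #include <vector>

    // Insertion sort: grow a sorted prefix, shifting larger elements one
    // slot to the right until the saved key can be dropped into its place.
    void insertion_sort(std::vector<int>& x) {
        for (std::size_t e = 1; e < x.size(); ++e) {
            int k = x[e];            // key to insert into the sorted prefix x[0..e-1]
            std::size_t L = e;
            while (L > 0 && x[L - 1] > k) {
                x[L] = x[L - 1];     // shift the larger element right
                --L;
            }
            x[L] = k;
        }
    }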


2.3. Selection Sort. By nature, selection sort is unstable, but it can be improved to become stable. In this sorting technique, we find the smallest number in the record, put it at the starting position, advance the position to the next, and then again find the smallest number in the remaining list. This process goes on until the whole list becomes sorted. This algorithm is efficient for smaller records, but for larger records, this technique is again code inefficient. The algorithm is given in Algorithm 3.

(1) for d = 0 to S − 2 do
(2)   set sml ← d
(3)   for f = d + 1 to S − 1 do
(4)     if A[f] < A[sml] then
(5)       set sml ← f
(6)     end if
(7)   end for
(8)   T ← A[d]
(9)   A[d] ← A[sml]
(10)  A[sml] ← T
(11) end for

ALGORITHM 3: Selection sort.

Its execution time is better for smaller data/records (up to hundreds of records). The best-, average-, and worst-case complexities are all O(n²).

2.4. Merge Sort. It is a stable sorting algorithm and is very efficient at handling big numbers. It is based on the following three steps [1]:

(1) Divide the given list or record into sublists in such a way that every list or sublist is divided in half
(2) Conquer the sublists by recursion
(3) Combine the sorted sublists simply by merging

The sorting is actually done in the second step. Merge sort is purely based on the divide-and-conquer technique. It requires double the memory required by other sorting techniques. The algorithm for merge sort is given in Algorithms 4 and 5. From the algorithm, it is analyzed that merge sort has the following recurrence relation:

T(n) = 2T(n/2) + cn. (3)

Using the master method, f(n) = cn, a = 2, and b = 2, and thus,

n^(log_b a) = n^(log₂ 2) = n¹ = n. (4)

Now,

T(n) = O(n log n). (5)

For all three cases, it would be O(n log n).

(1) if S < T then
(2)   U ← (S + T)/2
(3)   Merge Sort (A, S, U)
(4)   Merge Sort (A, U + 1, T)
(5)   Merge (A, S, U, T)
(6) end if

ALGORITHM 4: Merge sort (A, S, T).
(1) int B[S..T]
(2) x ← S; y ← S
(3) z ← U + 1
(4) while (x ≤ U) and (z ≤ T) do
(5)   if A[x] ≤ A[z] then
(6)     B[y++] ← A[x++]
(7)   else
(8)     B[y++] ← A[z++]
(9)   end if
(10) end while
(11) while (x ≤ U) do
(12)  B[y++] ← A[x++]
(13) end while
(14) while (z ≤ T) do
(15)  B[y++] ← A[z++]
(16) end while
(17) for x = S to T do
(18)  A[x] ← B[x]
(19) end for

ALGORITHM 5: Merge (A, S, U, T).
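A compact C++ rendering of Algorithms 4 and 5 (our sketch; the scratch buffer B makes the doubled memory requirement mentioned above explicit):

    #include <vector>

    // Merge the sorted halves A[s..u] and A[u+1..t] through the scratch
    // buffer B, then copy back (Algorithm 5).
    static void merge_halves(std::vector<int>& A, std::vector<int>& B,
                             int s, int u, int t) {
        int x = s, z = u + 1, y = s;
        while (x <= u && z <= t) B[y++] = (A[x] <= A[z]) ? A[x++] : A[z++];
        while (x <= u) B[y++] = A[x++];
        while (z <= t) B[y++] = A[z++];
        for (int i = s; i <= t; ++i) A[i] = B[i];
    }

    // Divide, conquer by recursion, and combine by merging (the three steps).
    static void merge_sort_rec(std::vector<int>& A, std::vector<int>& B,
                               int s, int t) {
        if (s < t) {
            int u = s + (t - s) / 2;
            merge_sort_rec(A, B, s, u);
            merge_sort_rec(A, B, u + 1, t);
            merge_halves(A, B, s, u, t);
        }
    }

    void merge_sort(std::vector<int>& A) {
        if (A.empty()) return;
        std::vector<int> B(A.size());  // the "double memory" cost of merge sort
        merge_sort_rec(A, B, 0, static_cast<int>(A.size()) - 1);
    }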
2.5. Quick Sort. It is unstable but efficient and is one of the fastest working sorting algorithms. It is based on the divide-and-conquer strategy. When talking about quick sort, the concept of the pivot element instantly comes to mind. The pivot element is a randomly chosen member of the list under sorting. Quick sort works well for both smaller and larger lists or records, but if the list is already sorted, results are sometimes obtained that we cannot even imagine. It is built on recursion. The algorithm is given in Algorithms 6 and 7.

(1) if T > S then
(2)   X ← random index from [S..T]
(3)   interchange A[X] with A[S]
(4)   U ← Partition (A, S, T)
(5)   Quick Sort (A, S, U − 1)
(6)   Quick Sort (A, U + 1, T)
(7) end if

ALGORITHM 6: Quick sort (A, S, T).

(1) Z ← A[S]
(2) U ← S
(3) for L ← S + 1 to T do
(4)   if (A[L] < Z) then
(5)     U ← U + 1
(6)     interchange A[U] with A[L]
(7)   end if
(8) end for
(9) interchange A[S] with A[U]
(10) return U

ALGORITHM 7: Partition (A, S, T).

The partition function maintains the following invariants:

A[S] = Z // the pivot
A[S + 1..U] // values < Z
A[U + 1..L − 1] // values ≥ Z
A[L..T] // unknown values

Complexity analysis: let m be the size of the list, and let the recursion run for n levels, so that m = 2ⁿ. Then

Comparisons = m + 2(m/2) + · · · + m(m/m) = m + m + · · · + m (n terms) = O(nm).

If n = m, it is the worst-case scenario, and the complexity is O(m²). But for the average- or best-case scenario, n = log₂ m, and the complexity is O(m log₂ m).
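A C++ sketch of Algorithms 6 and 7, with the randomized pivot moved to the front before partitioning (our transcription):

    #include <cstdlib>
    #include <utility>
    #include <vector>

    // Partition around the pivot Z = A[s] (Algorithm 7): on return the
    // pivot sits at its final index u, with all smaller values to its left.
    static int partition(std::vector<int>& A, int s, int t) {
        int Z = A[s], u = s;
        for (int L = s + 1; L <= t; ++L)
            if (A[L] < Z) std::swap(A[++u], A[L]);
        std::swap(A[s], A[u]);
        return u;
    }

    // Quick sort with a random pivot moved to the front (Algorithm 6).
    static void quick_sort(std::vector<int>& A, int s, int t) {
        if (t > s) {
            int x = s + std::rand() % (t - s + 1);  // random pivot index in [s, t]
            std::swap(A[x], A[s]);
            int u = partition(A, s, t);
            quick_sort(A, s, u - 1);
            quick_sort(A, u + 1, t);
        }
    }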
2.6. Tree Sort. It is an unstable sorting algorithm built on the binary search tree (BST). The elements in tree sort are sorted using an in-order traversal. Tree sort requires extra memory space, and its complexity changes from a balanced BST to an unbalanced BST. The complexity of tree sort is O(n²) for the worst case and O(n log n) for the average and best cases.

2.7. Gnome Sort. It is a stable sorting approach. When we think that a list or record is sorted but we are not sure, we need an algorithm which works best on a sorted list; for this purpose, we use gnome sort. It performs well not only on the sorted list but also on the unsorted list. The algorithm is given in Algorithm 8. From the algorithm, it is clearly seen that if the list is sorted, no interchange of elements is done; hence, it executes linearly. Thus, for the best case, it is O(n), and for the average and worst cases, it is O(n²).
(1) K ← 1
(2) while K < length(X) do
(3)   if (X[K] ≥ X[K − 1]) then
(4)     K ← K + 1
(5)   else
(6)     T ← X[K]
(7)     X[K] ← X[K − 1]
(8)     X[K − 1] ← T
(9)     if (K > 1) then
(10)      K ← K − 1
(11)    end if
(12)  end if
(13) end while

ALGORITHM 8: Fgnome (X[]).
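A C++ transcription of Algorithm 8 (ours); the single index walking forward over ordered neighbors is why a sorted input costs only one linear scan:

    #include <cstddef>
    #include <utility>
    #include <vector>

    // Gnome sort: walk forward while adjacent pairs are in order; on a
    // violation, swap and step back. A sorted list is verified in O(n).
    void gnome_sort(std::vector<int>& x) {
        std::size_t k = 1;
        while (k < x.size()) {
            if (x[k] >= x[k - 1]) {
                ++k;                        // pair in order: move on
            } else {
                std::swap(x[k], x[k - 1]);  // out of order: swap and back up
                if (k > 1) --k;
            }
        }
    }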
2.8. Counting Sort. It is a stable and easily understandable sorting algorithm. As the name depicts, counting sort works by finding the largest element in the given list/record; then, starting from the least element/number, the frequency of each value is counted, and at last the sorted list is produced, maintaining the order of occurrence while sorting. It is useful in those cases where the difference between the numbers is very small and the dataset is also very small. The step-by-step procedure of counting sort is given in Algorithm 9: the second loop counts frequencies, the third accumulates them into positions, and the last places each element stably from the back.

(1) for a ← 1 to m do
(2)   D[a] ← 0
(3) end for
(4) for b ← 1 to length(P) do
(5)   D[P[b]] ← D[P[b]] + 1
(6) end for
(7) for a ← 2 to m do
(8)   D[a] ← D[a] + D[a − 1]
(9) end for
(10) for b ← length(P) down to 1 do
(11)  Q[D[P[b]]] ← P[b]
(12)  D[P[b]] ← D[P[b]] − 1
(13) end for

ALGORITHM 9: Counting sort (P, Q, m).
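A C++ sketch in the spirit of Algorithm 9 (our interface choice, assuming nonnegative keys bounded by m):

    #include <cstddef>
    #include <vector>

    // Counting sort for keys in [0, m]: count frequencies, accumulate them
    // into positions, then place elements from the back so that equal keys
    // keep their relative order (stability).
    std::vector<int> counting_sort(const std::vector<int>& p, int m) {
        std::vector<int> d(m + 1, 0), q(p.size());
        for (int v : p) ++d[v];                         // frequency of each key
        for (int a = 1; a <= m; ++a) d[a] += d[a - 1];  // cumulative positions
        for (std::size_t b = p.size(); b-- > 0; )
            q[--d[p[b]]] = p[b];                        // stable backward placement
        return q;
    }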

2.9. Grouping Comparison Sort (GCS). Suleiman, with his team of three other members, proposed the GCS algorithm. The methodology they used is to divide the given list/record into groups. Each of these groups contains three elements, and comparison is done in such a way that every element of one group is compared with those in the other groups. The main drawback of this algorithm is that the input size must be less than or equal to 25000 records to get better results. The complexity is O(n²) for all three cases.

2.10. Heap Sort. Heap sort is a stable but efficient sorting algorithm which is based on the complete binary tree and follows the heap order. Heap sort uses a min heap, in which the root node holds the minimum value, or a max heap, in which the root node holds the maximum value. The procedure of heap sort is explained through Algorithms 10 and 11. For heap sort, in all three cases (best, average, and worst), the complexity is the same, O(n log n), where n is the number of records in the list.

(1) Build Heap (A, X)
(2) a ← X
(3) while a ≥ 2 do
(4)   Y ← A[1]
(5)   A[1] ← A[a]
(6)   A[a] ← Y
(7)   a ← a − 1
(8)   Heapy (A, 1, a)
(9) end while

ALGORITHM 10: Heap sort (A, X).

(1) L ← left(k)
(2) R ← right(k)
(3) maximum ← k
(4) if (L ≤ a) and (A[L] > A[maximum]) then
(5)   maximum ← L
(6) end if
(7) if (R ≤ a) and (A[R] > A[maximum]) then
(8)   maximum ← R
(9) end if
(10) if (maximum ≠ k) then
(11)  t ← A[k]
(12)  A[k] ← A[maximum]
(13)  A[maximum] ← t
(14)  Heapy (A, maximum, a)
(15) end if

ALGORITHM 11: Heapy (A, k, a).
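A C++ sketch corresponding to Algorithms 10 and 11, using 0-based child indices (our transcription):

    #include <cstddef>
    #include <utility>
    #include <vector>

    // Sift A[k] down within the heap A[0..a-1] (Heapy in Algorithm 11).
    static void heapy(std::vector<int>& A, std::size_t k, std::size_t a) {
        std::size_t L = 2 * k + 1, R = 2 * k + 2, maximum = k;
        if (L < a && A[L] > A[maximum]) maximum = L;
        if (R < a && A[R] > A[maximum]) maximum = R;
        if (maximum != k) {
            std::swap(A[k], A[maximum]);
            heapy(A, maximum, a);
        }
    }

    // Build a max heap, then repeatedly move the root to the end of the
    // shrinking heap (Algorithm 10).
    void heap_sort(std::vector<int>& A) {
        const std::size_t n = A.size();
        for (std::size_t k = n / 2; k-- > 0; ) heapy(A, k, n);  // build heap
        for (std::size_t a = n; a >= 2; --a) {
            std::swap(A[0], A[a - 1]);  // largest element goes to the tail
            heapy(A, 0, a - 1);         // restore heap order on the rest
        }
    }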
2.11. Radix Sort. It is a stable and efficient sorting algorithm when the size of the list/record is small. Internally, it acts like counting sort. One of the drawbacks of radix sort is that it operates on each number several times, once per significant digit. For example, for the number 169, radix sort operates on it three times, from the least significant digit 9 to the most significant digit 1. Radix sort compares the least significant digits of all the numbers in the same way and proceeds digit by digit to produce the sorted list. The procedure of radix sort is given in Algorithm 12.

(1) change ← 1
(2) for k = 1 to size do
(3)   for e = 1 to a do
(4)     z ← (X[e]/change) mod 10
(5)     append (bucket[z], X[e])
(6)   end for
(7)   X ← combine buckets ()
(8)   change ← change ∗ 10
(9) end for

ALGORITHM 12: Radix sort (X, a).

If the longest number has m digits and there are n elements, the complexity is O(m ∗ n). If the number of digits is bounded by a constant, it can be ignored, and the complexity becomes O(n) for all three cases.
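An LSD radix sort sketch in C++ following Algorithm 12, with ten digit buckets (our transcription; assumes nonnegative integers):

    #include <algorithm>
    #include <array>
    #include <cstddef>
    #include <vector>

    // LSD radix sort: distribute by each decimal digit from least to most
    // significant; the stable bucket order preserves earlier passes.
    void radix_sort(std::vector<int>& x) {
        if (x.empty()) return;
        const int largest = *std::max_element(x.begin(), x.end());
        for (long change = 1; largest / change > 0; change *= 10) {
            std::array<std::vector<int>, 10> bucket;
            for (int v : x) bucket[(v / change) % 10].push_back(v);  // digit z
            std::size_t i = 0;
            for (const auto& b : bucket)        // combine the buckets in order
                for (int v : b) x[i++] = v;
        }
    }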

2.12. Cocktail Sort. It is a stable and efficient sorting algorithm compared to bubble sort, of which it is an extended version. Cocktail sort works on both sides of the list: during sorting, it puts the largest element to the tail side and the smallest element to the head side. The head side and tail side are shown in Figure 1.

[Figure 1: Working of cocktail sort — a list with its head end on the left and its tail end on the right.]

Bubble sort puts the biggest element to the tail side after every pass, while cocktail sort puts the smallest element to the head side and the biggest element to the tail side after every pass. The complexity of cocktail sort for the worst and average cases is O(n²), but for the best case, it is O(n).

2.13. Comb Sort. It is a stable and another improved version of bubble sort, as it shrinks the gap size by a factor of 1.3 on every iteration instead of always using a gap of 1. The gap size tells the algorithm which elements to compare and swap. As the gap size increases, the number of swaps decreases; thus, in the average-case scenario, comb sort performs better. But for the worst case, it remains the same as O(n²). For the best case, it is O(n), as no swap is done.

2.14. Enhanced Selection Sort. It is an extended version of selection sort which is made stable by decreasing the working size of the list. This is done in such a way that first the biggest element is found, a swap is made, the size of the list is decreased by 1, and then the sort is performed again in a similar way until the list is sorted. Although the complexity is the same as that of selection sort, in the best-case scenario the number of swaps is zero for enhanced selection sort. The complexity for all three cases is O(n²).
2.15. Shell Sort. It is an unstable and efficient sorting algorithm which is an extended version of insertion sort. Shell sort works well if the given list is partially sorted, which is the average-case scenario. Shell sort uses Knuth's formula to calculate the interval or spacing:

x = 3x + 1, (6)

where x has the starting value 1 and is called the interval/spacing. Shell sort divides the given list into sublists by using an increment called the gap, compares the elements, and then interchanges them depending upon the order, either increasing or decreasing. The algorithm for shell sort is given in Algorithm 13.

(1) x ← 1
(2) while x < m do
(3)   x ← 3 ∗ x + 1
(4) end while
(5) while x > 0 do
(6)   x ← x/3
(7)   for i = 1 to x do
(8)     insertion sort the sublist X[i : x : m] ▷ invariant: each x-spaced sublist is sorted
(9)   end for
(10) end while

ALGORITHM 13: Shell sort.

For the best and worst cases, the complexity is O(n log n) and O(n²), respectively. For the average case, it depends upon the gap.
and O(n2 ), respectively. For the average case, it depends
upon the gap. 2.18. Even-Odd/Brick Sort. It is another extension of bubble
sort, in which the BS algorithm is partitioned into two parts:
even part and odd part. Even part consists of even numbers,
2.16. Bucket Sort. It is a stable sort and consists of bucket. and odd part consists of odd numbers. Both the parts are
The elements are inserted in the bucket, and then, the sorting executed one by one, and at last, the combined result is
is employed on the bucket. Bucket sort does not have obtained and the records are sorted. It is a stable algorithm
comparisons and uses the index to sort. These indexes are with the same complexity as the bubble sort:
not just obtained from any mathematical function but are
obtained in such a way that they could satisfy the order of best case ⟶ O(n),
arrangement of numbers inserted in the bucket. The pro- worst case ⟶ O􏼐n2 􏼑, (7)
cedure is explained in Algorithm 14.
2
The complexity for bucket sort would be O(n) for all the average case ⟶ O􏼐n 􏼑.
three cases.

2.17. Tim Sort. It is a combination of insertion and merge 2.19. Bitonic Sort. It is introduced through the concept of
sort. It is a stable and efficient sorting technique in which the merge sort. In bitonic sort, we move the list to level L − 1 with
list or record is split into blocks called “run.” If the size of list two parts: left part and right part; the left part is settled in an
or record is less than run, then it can be sorted using in- increasing order, and the right part is settled in a decreasing
sertion sort. The maximum size of the run would be 64 order [2]. These parts are merged, moved to level L, and then
depending upon the list or record. But if the size of the sorted to form the sorted sequence. The complexity for bitonic
unsorted list is very large, then both insertion and merge sort is O(n2 ) on best, worst, and average cases.
3. Literature Review and Related Work

Ali [3] discussed a number of sorting algorithms, evaluating their time complexity, stability, and in-place nature. Ali found their running times in virtual and real environments and suggested where to use a particular algorithm so that the result obtained is efficient, concluding that quick sort is a better option for sorting in the average-case scenario, while counting, bucket, and radix sorts are efficient for smaller lists/records and integer-type data.

Hammad [4] compared three sorting algorithms, namely, bubble, selection, and gnome sorts, based on their average running time. Hammad took a number of readings to find the running time and concluded that whenever the record list is sorted, gnome sort appears to be the fastest sorting algorithm, but when the list or record is unsorted, gnome sort takes the same running time as bubble sort or selection sort in their worst or average case (i.e., O(n²)).

Elkahlout and Maghari [5] discussed two advanced versions of bubble sort, namely, comb and cocktail sort, and one linear-time technique, namely, counting sort. Comparing these techniques, they concluded that cocktail sort performs better on an average evaluation of process time. All these algorithms are graphically implemented in their paper, with the main focus on time complexity.

Jehad and Rami [6] made some changes to bubble sort and selection sort so as to reduce the number of swaps during the sorting operation. The authors follow the same procedure, compare the enhanced versions of bubble sort and selection sort with the original ones, and thereby reduce the execution time. The complexity of enhanced bubble sort is reduced from O(n²) to O(n log n), while that of enhanced selection sort remains the same as selection sort.

Pankaj [7], using the C programming language, compared five sorting techniques, namely, bubble, selection, quick, merge, and insertion sort, on the basis of average running time. The execution time is calculated in microseconds, and he concluded that quick sort is a better option for sorting from 10 to 10000 elements; the paper graphically represents the average running time of each algorithm.

Khalid et al. [8] proposed a new algorithm, namely, grouping comparison sort, which is then compared with the traditional sorting techniques. The proposed algorithm has a limitation of an input size of 25000 elements; as the input size increases, the results degrade badly. All the above-discussed papers use traditional sorting algorithms, compare them to get the average running time, and conclude which of them is better to use.

4. SA Sorting

Starting from the left extreme end of the list/record, the first element is taken as the target. The target is compared with the other elements until a smaller element is found. The target is swapped with the smaller element and continuously compared till the extreme right end of the list. Then, going back to the target position, a new target element is taken at that position and processed in the same way. The position is not advanced until the targeted element is found to have already been operated on; when that happens, the position is advanced by 1. In this way, SA sorting works. The step-by-step process of SA sorting is given in Algorithm 15.

(1) int d[n], f[n] ← {0}, p, j, t, i
(2) for i = 0 to n − 1 do
(3)   j ← 0
(4)   while j < n and f[j] ≠ 0 do
(5)     j ← j + 1
(6)   end while
(7)   if (j < n) then
(8)     p ← j
(9)     f[p] ← 1
(10)  else
(11)    exit (0)
(12)  end if
(13)  j ← p + 1
(14)  while j < n do
(15)    if (d[p] > d[j]) then
(16)      t ← d[p]
(17)      d[p] ← d[j]
(18)      d[j] ← t
(19)      t ← f[p]
(20)      f[p] ← f[j]
(21)      f[j] ← t
(22)      p ← j
(23)    end if
(24)    j ← j + 1
(25)  end while
(26) end for

ALGORITHM 15: SA sort ().

The number of comparisons, C, for SA sorting is given as

C = n(n − 1)/2 + S, (8)

where S is the number of swaps. In the best case, S = 0, so

C = n(n − 1)/2, i.e., the sum of (i − 1) for i = 1 to n. (9)

Let T(n) = n(n − 1)/2, so that T(1) = 0, T(2) = 1, and T(3) = 3. Now, creating the recurrence relation,

T(n) = 9T(n/3) + n. (10)

Using induction to check:

At n = 3, T(3) = 9T(3/3) + 3 = 9T(1) + 3 = 3
At n = 6, T(6) = 9T(6/3) + 6 = 9T(2) + 6 = 15
At n = 9, T(9) = 9T(9/3) + 9 = 9T(3) + 9 = 36, and so on.

Solving using the master method, a = 9, b = 3, f(n) = n, and n^(log_b a) = n^(log₃ 9) = n² > n; thus, T(n) = O(n²).
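For concreteness, the following is a direct C++ transcription of Algorithm 15 as we read it (the flag array f marks elements that have already served as the target):

    #include <cstddef>
    #include <utility>
    #include <vector>

    // SA sort, transcribed from Algorithm 15: pick the first unflagged
    // element as the target, flag it, then sweep right, swapping the target
    // with any smaller element (flags travel with the values) and following it.
    void sa_sort(std::vector<int>& d) {
        const std::size_t n = d.size();
        std::vector<int> f(n, 0);            // f[j] = 1 once j has held a target
        for (std::size_t i = 0; i < n; ++i) {
            std::size_t j = 0;
            while (j < n && f[j] != 0) ++j;  // first element not yet operated on
            if (j == n) return;              // every element handled: done
            std::size_t p = j;
            f[p] = 1;
            for (j = p + 1; j < n; ++j) {
                if (d[p] > d[j]) {
                    std::swap(d[p], d[j]);   // move the smaller value left
                    std::swap(f[p], f[j]);   // the flag follows the value
                    p = j;                   // keep tracking the target
                }
            }
        }
    }

On an already sorted list the inner sweep performs no swaps, which matches the S = 0 best case above.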
However, f(n) is a polynomial smaller than n^(log_b a). SA sorting can be optimized in the future, but in comparison with the optimized quick sort and merge sort, it performs better on an already sorted list. For the worst and average cases, the complexity is O(n²) + S; since S is already included in C, it can be neglected, and thus, for all three cases, it is O(n²).

5. Results

The proposed sorting technique is implemented in C++ and tested with different numbers of elements. The performance of SA sorting is measured in terms of execution time and memory required for sorting. The comparison of the execution time and memory used by existing sorting techniques with SA sorting is shown in Tables 1 and 2. Moving from a lower dataset to a higher one with a sorted dataset, we found that SA sorting improves and performs better. Regarding the memory requirement from a lower dataset to a higher one, only a slight change can be seen.

Table 1: Comparison of execution time in microseconds.

S. no.  No. of entries  Nature of entry  Quick  Merge  Heap  Insertion  Selection  Shell  SA
1       100             Unsorted         67     75     44    65         76         33     109
2       500             Unsorted         150    165    185   207        495        114    1078
3       1k              Unsorted         232    271    231   553        1722       373    4223
4       5k              Unsorted         1207   1389   2141  19335      39865      1452   124973
5       10k             Unsorted         2359   2826   4588  67283      158382     3100   506885
6       20k             Unsorted         4932   5934   9747  268644     627031     6710   2054339
7       1k              Sorted           3492   265    377   144        1736       156    3146

Table 2: Comparison of memory used in MB.

S. no.  No. of entries  Nature of entry  Quick  Merge  Heap   Insertion  Selection  Shell  SA
1       1k              Unsorted         1.503  1.503  3.110  1.502      1.514      3.109  1.350
2       5k              Unsorted         1.561  1.570  3.242  1.620      1.520      3.242  1.611
3       10k             Unsorted         1.630  1.640  3.300  1.600      1.678      3.343  1.615
4       20k             Unsorted         1.800  1.795  3.385  1.622      1.771      3.424  1.786
5       1k              Sorted           1.470  1.491  3.230  1.420      1.510      3.105  1.413
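The paper does not reproduce its measurement code; the following is a minimal, hypothetical C++ sketch of how such timings could be collected (the sizes, seed, value range, and clock choice are our assumptions, not the authors' setup):

    #include <algorithm>
    #include <chrono>
    #include <iostream>
    #include <random>
    #include <vector>

    void sa_sort(std::vector<int>& d);  // from the sketch in Section 4

    // Time one sort over a fresh input and report microseconds, mirroring
    // the kind of runs summarized in Table 1.
    int main() {
        for (std::size_t n : {100u, 500u, 1000u, 5000u, 10000u, 20000u}) {
            std::vector<int> data(n);
            std::mt19937 gen(42);
            std::uniform_int_distribution<int> dist(0, 1000000);
            for (auto& v : data) v = dist(gen);      // unsorted input; enable
            // std::sort(data.begin(), data.end());  // this line for the sorted case
            auto start = std::chrono::steady_clock::now();
            sa_sort(data);
            auto stop = std::chrono::steady_clock::now();
            auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
            std::cout << n << " entries: " << us.count() << " us\n";
        }
    }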
6. Conclusion

While implementing all these sorting techniques and comparing them with SA sorting, the following points are concluded:

(1) If we increase the space, the time reduces, as shell sort and heap sort show
(2) The sorting techniques which work well on unsorted records are not very good on sorted records, as quick sort and merge sort show
(3) In the worst-case scenario, most of the sorting techniques rely on O(n²), as SA sorting does
(4) No sorting technique is universally used; its usage depends upon its nature and the user's requirement
(5) SA sorting needs to be improved and optimized in the future

Data Availability

Our article is purely based on algorithm design. We evolve our results from unsorted and sorted data files which include different records, in order to compare the algorithm with already established algorithms. Thus, no such proper data have been used.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] T. H. Cormen, Introduction to Algorithms, MIT Press, Cambridge, MA, USA, 2009.
[2] E. Horowitz and S. Sahni, Fundamentals of Computer Algorithms, Computer Science Press, Cambridge, MA, USA, 1978.
[3] K. Ali, "A comparative study of well known sorting algorithms," International Journal, vol. 8, no. 1, 2017.
[4] J. Hammad, "A comparative study between various sorting algorithms," International Journal of Computer Science and Network Security (IJCSNS), vol. 15, no. 3, p. 11, 2015.
[5] A. H. Elkahlout and A. Y. A. Maghari, "A comparative study of sorting algorithms comb, cocktail and counting sorting," 2017.
[6] A. Jehad and M. Rami, "An enhancement of major sorting algorithms," International Arab Journal of Information Technology, vol. 7, no. 1, 2010.
[7] P. Sareen, "Comparison of sorting algorithms (on the basis of average case)," International Journal of Advanced Research in Computer Science and Software Engineering, vol. 3, no. 3, pp. 522–532, 2013.
[8] K. S. Al-Kharabsheh, I. M. AlTurani, A. M. I. AlTurani, and N. I. Zanoon, "Review on sorting algorithms a comparative study," International Journal of Computer Science and Security (IJCSS), vol. 7, no. 3, pp. 120–126, 2013.