A novel integer value-sorting technique is proposed, replacing the bucket sort, distribution counting sort and address calculation sort family of algorithms. It requires only a constant amount of additional memory. The technique is inspired by one of the ordinal theories of "serial order in behavior" and explained by analogy with the three main stages in the formation and retrieval of memory in cognitive neuroscience, namely (i) practicing, (ii) storing and (iii) retrieval. Although not directly suitable for integer rank-sorting, where the problem is to put an array of elements into ascending or descending order by their numeric keys, each of which is an integer, the technique seems efficient and extensible to rank-sorting, as well as to other problems such as hashing, searching, element distinction, succinct data structures, gaining space, etc.
A novel integer sorting technique was proposed, replacing the bucket sort, distribution counting sort and address calculation sort family of algorithms, which requires only a constant amount of additional memory. The technique was inspired by one of the ordinal theories of "serial order in behavior" and explained by analogy with the three main stages in the formation and retrieval of memory in cognitive neuroscience, namely (i) practicing, (ii) storing and (iii) retrieval. In this study, the technique is improved both theoretically and practically, and an algorithm is obtained which is faster than the former, making it more competitive. With the improved version, n integers S[0...n-1], each in the range [0, n-1], are sorted exactly in O(n) time, while the complexity of the former technique was the recursion T(n) = T(n/2) + O(n), yielding T(n) = O(n).
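(For reference, the stated bound for the former technique follows by expanding the recursion: T(n) = T(n/2) + cn <= cn + cn/2 + cn/4 + ... <= 2cn, hence T(n) = O(n) by the geometric series.)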
In the first place, a novel, yet straightforward, in-place integer value-sorting algorithm is presented. It sorts in linear time using a constant amount of additional memory for storing counters and indices beside the input array. The technique is inspired by the principal idea behind one of the ordinal theories of "serial order in behavior" and explained by analogy with the three main stages in the formation and retrieval of memory in cognitive neuroscience: (i) practicing, (ii) storage and (iii) retrieval. It is further improved in terms of time complexity as well as specialized for distinct integers, though still improper for rank-sorting. Afterwards, another novel, yet straightforward, technique is introduced which makes this efficient value-sorting technique proper for rank-sorting. Hence, given an array of n elements, each having an integer key, the technique sorts the elements according to their integer keys in linear time using only a constant amount of additional memory...
The in-place associative integer sorting technique was proposed for integer lists; it requires only a constant amount of additional memory and replaces the bucket sort, distribution counting sort and address calculation sort family of algorithms. The technique was explained by analogy with the three main stages in the formation and retrieval of memory in cognitive neuroscience, which are (i) practicing, (ii) storing and (iii) retrieval. In this study, the technique is specialized with two variants, one for read-only integer keys and the other for modifiable integers. Hence, a novel algorithm is obtained that does not require additional memory other than a constant amount and sorts faster than all others, no matter how large the list is, provided that m = O(n log n), where m is the range and n is the number of keys (or integers).
The in-place associative integer sorting technique was developed, improved and specialized for distinct integers. The technique is suitable for integer sorting: given a list S of n integers S[0...n-1], it sorts the integers in ascending or descending order. It replaces the bucket sort, distribution counting sort and address calculation sort family of algorithms and requires only a constant amount of additional memory for storing counters and indices beside the input list. The technique was inspired by one of the ordinal theories of "serial order in behavior" and explained by analogy with the three main stages in the formation and retrieval of memory in cognitive neuroscience: (i) practicing, (ii) storing and (iii) retrieval. In this study, the in-place associative permutation technique is introduced for the integer key-sorting problem. Given a list S of n elements S[0...n-1], each having an integer key in the range [0, m-1], the technique sorts the elements according to t...
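The abstract does not detail the associative mechanism itself; as a point of reference only, the Java sketch below (with a hypothetical method name) shows the classic index-placement idea for the simpler special case of n distinct integers drawn from [0, n-1], which likewise sorts in O(n) time with O(1) extra memory. It illustrates the problem setting, not the paper's algorithm.

    // Hypothetical illustration: sorts n DISTINCT integers, each in [0, n-1],
    // in place. Every value equals its final index, so each swap sends one
    // element home; total work is O(n) with O(1) extra memory.
    static void sortDistinct(int[] a) {
        for (int i = 0; i < a.length; i++) {
            while (a[i] != i) {
                int v = a[i];      // v belongs at index v
                a[i] = a[v];       // bring the current occupant of index v here
                a[v] = v;          // place v at its home position
            }
        }
    }

For instance, {2, 0, 3, 1} is sorted after three such swaps.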
The in-place associative integer sorting technique was proposed for integer lists; it requires only a constant amount of additional memory and replaces the bucket sort, distribution counting sort and address calculation sort family of algorithms. Afterwards, the technique was further improved and an in-place sorting algorithm was proposed in which n integers S[0...n-1], each in the range [0, n-1], are sorted exactly in O(n) time, while the complexity of the former technique was the recursion T(n) = T(n/2) + O(n), yielding T(n) = O(n). The technique was specialized with two variants, one for read-only distinct integer keys and the other for modifiable distinct integers, as well. Assuming w is the fixed word length, the variant for modifiable distinct integers was capable of sorting n distinct integers S[0...n-1], each in the range [0, m-1], in exactly O(n) time if m < (w - log n)n. Otherwise, it sorts in O(n + m/(w - log n)) time for the worst, O(m/(w - log n)) time for the average (uniformly distributed keys) a...
Sorting is an operation that arranges the elements of a data structure in some logical order. In our daily lives, we do work in sorted order without even thinking about sorting, which is why an efficient sorting technique that solves the sorting problem within limited time is needed. We have therefore discussed various existing sorting algorithms with their advantages and disadvantages. In this paper, we propose a new sorting algorithm which overcomes some common disadvantages of traditional existing algorithms by properly utilizing memory. We compare our algorithm with traditional existing algorithms using several factors.
Bucket sort and radix sort are two well-known integer sorting algorithms. This paper empirically measures their time usage and memory consumption for different kinds of input sequences. The algorithms are compared both from a theoretical standpoint and on how well they perform in six different use cases using randomized sequences of numbers. The measurements provide data on how good they are in different real-life situations.
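Neither implementation is reproduced in the abstract; as a hedged point of comparison, a textbook least-significant-digit radix sort for non-negative int keys over 8-bit digits might look like the Java sketch below (the digit width, the temporary buffer and the method name are choices of this illustration, not of the paper).

    // Hypothetical illustration: textbook LSD radix sort for non-negative
    // ints using 8-bit digits. Four stable counting passes; O(n) extra
    // memory for the output buffer.
    static void lsdRadixSort(int[] a) {
        int n = a.length;
        int[] buf = new int[n];
        for (int shift = 0; shift < 32; shift += 8) {
            int[] count = new int[257];
            for (int x : a) count[((x >>> shift) & 0xFF) + 1]++;     // histogram
            for (int d = 0; d < 256; d++) count[d + 1] += count[d];  // prefix sums
            for (int x : a) buf[count[(x >>> shift) & 0xFF]++] = x;  // stable scatter
            System.arraycopy(buf, 0, a, 0, n);
        }
    }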
2016
Sorting a sequence of numbers is an essential task that is involved in many computing algorithms and techniques. In this paper a new sorting algorithm is proposed that breaks the O(n log n) limit of the best-known sorting techniques. The algorithm is designed to sort a sequence of integer numbers and may be extended to operate on decimal numbers as well. The proposed algorithm offers a speed-up of nearly m + 3 log n − 1, where n is the size of the list and m is the size of each element in the list. The time complexity of the algorithm may be considered linear under certain constraints that should be followed in the implementation phase, while the space complexity is linear too. The new algorithm was given the name Bucket Then Binary Radix Sort, as a notation for the techniques it uses.
This paper introduces a new, faster sorting algorithm (ARL - Adaptive Left Radix) that does in-place, non-stable sorting. Left radix, often called MSD (Most Significant Digit) radix, is not new in itself, but the adaptive feature and the in-place sorting ability are new. ARL sorts with only internal moves in the array and uses a dynamically defined radix for each pass. ARL is a recursive algorithm that sorts by first sorting on the most significant 'digits' of the numbers - i.e. going from left to right. Its space requirement is O(N + log M) and its time performance is O(N log M), where M is the maximum value sorted and N is the number of integers to sort. The performance of ARL is compared both with the built-in Quicksort algorithm in Java, Arrays.sort(), and with ordinary radix sorting (sorting from right to left). ARL is almost twice as fast as Quicksort if N > 100. This applies to the normal case, a uniformly drawn distribution of the numbers 0:N...
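The adaptive radix selection that gives ARL its name is not spelled out in the abstract; purely as an illustration of the left-radix (MSD) idea with only internal moves, a fixed-radix binary variant - often called radix-exchange sort, and not ARL itself - might look like this Java sketch:

    // Hypothetical illustration: binary MSD ("left radix") sort, also known
    // as radix-exchange sort. Partitions on the current bit with internal
    // swaps only, then recurses on both groups with the next lower bit.
    // In-place and non-stable, but with a fixed radix of 2 (ARL adapts its radix).
    static void binaryMsdSort(int[] a, int lo, int hi, int bit) {
        if (hi - lo < 2 || bit < 0) return;
        int i = lo, j = hi - 1;
        while (i <= j) {
            if (((a[i] >>> bit) & 1) == 0) {
                i++;                                  // 0-bit: stays on the left
            } else {
                int t = a[i]; a[i] = a[j]; a[j] = t;  // 1-bit: swap to the right
                j--;
            }
        }
        binaryMsdSort(a, lo, i, bit - 1);             // 0-bit group
        binaryMsdSort(a, i, hi, bit - 1);             // 1-bit group
    }
    // Typical call for non-negative ints: binaryMsdSort(a, 0, a.length, 30);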
Advances in Intelligent Systems and Computing, 2017
In this paper we introduce RADULS2, the fastest parallel sorter based on the radix algorithm. It is optimized to process huge amounts of data, making use of modern multicore CPUs. The main novelties include: an extremely optimized algorithm for handling tiny arrays (up to about a hundred records) that can appear even billions of times as subproblems, and improved processing of larger subarrays with better use of non-temporal memory stores.
International Journal of Computer Science and Mobile Computing, 2022
Data is the new fuel. With the expansion of global technology, rising standards of living and modernization, the value of data has reached a great height. Nowadays, nearly all top MNCs feed on data. Storing all this data is a prime concern for all of them, which is addressed by data structures, the systematic way of storing data. Once these data are stored securely, they must be utilized in the most efficient way. There are many operations that need to be performed on these massive chunks of data, like searching, sorting, inserting, deleting, merging and more. In this paper, we compare all the major sorting algorithms that have prevailed to date. Further, an inequality in the dimension of time among the four sorting algorithms discussed in the paper - Bubble, Selection, Insertion and Merge - is proposed.
This paper presents a fast way to generate the permutation p that defines the sorting order of a set of integer keys in an integer array 'a', that is: a[p[i]] is the i'th sorted element in ascending order. The motivation for using Psort is given along with its implementation in Java. This distribution-based sorting algorithm, Psort, is compared to two comparison-based algorithms, Heapsort and Quicksort, and two other distribution-based algorithms, bucket sort and radix sort. The relative performance of the distribution sorting algorithms for arrays larger than a few thousand elements is assumed to be determined by how well they handle caching (both level 1 and level 2). The effect of caching is investigated, and based on these results, more cache-friendly versions of Psort and radix sort are presented and compared. Introduction: Sorting is maybe the single most important algorithm performed by computers, and certainly one of the most investigated topics in algorithmic design. Numerous sor...
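The abstract specifies the interface (a[p[i]] is the i'th smallest key) but not Psort's internals; a minimal counting-based way to build such a permutation for keys known to lie in [0, m) could look like the Java sketch below. It illustrates the interface only and is not the paper's cache-tuned implementation; the method name and the range parameter m are assumptions of this sketch.

    // Hypothetical illustration: builds p so that a[p[0]] <= a[p[1]] <= ...
    // (stable), assuming every key lies in [0, m). O(n + m) time and space.
    static int[] sortingPermutation(int[] a, int m) {
        int n = a.length;
        int[] start = new int[m + 1];
        for (int x : a) start[x + 1]++;                        // histogram of keys
        for (int d = 0; d < m; d++) start[d + 1] += start[d];  // start[d] = #keys < d
        int[] p = new int[n];
        for (int i = 0; i < n; i++) p[start[a[i]]++] = i;      // scatter indices
        return p;
    }

For example, for a = {3, 1, 3, 0} and m = 4 it returns p = {3, 1, 0, 2}, so a[p[0]], ..., a[p[3]] read 0, 1, 3, 3.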
Almost all computers regularly sort data, and many different sorting algorithms have been proposed. It is known that no sorting algorithm based on key comparisons can sort N keys in fewer than O(N log N) operations and that many perform O(N^2) operations in the worst case. This paper proposes a new sortie tree data structure, which can be used for sorting data. The algorithm that implements the sortie tree data structure is a non-comparative sorting algorithm.
An algorithm is a well-defined procedure that takes some input in the form of certain values, processes them and gives certain values as output. Although there is a large variety of sorting algorithms, the sorting problem has attracted a great deal of research, because effective sorting is important for enhancing the use of other algorithms. A novel sorting algorithm, namely the "V-Re-Fr (VRF) Sorting Algorithm", is proposed to address the limitations of the current popular sorting algorithms. The goal of this paper is to propose a new algorithm which will provide improved functionality and reduce algorithmic complexity. The observations, backed by a literature survey, indicate that the proposed algorithm is much more efficient in terms of the number of swaps or iterations than the other algorithms having O(n^2) complexity, like the insertion, selection and bubble sort algorithms.
2015
Sorting is considered a very basic operation in computer science and is used as an intermediate step in many operations. Sorting refers to the process of arranging a list of elements in a particular order, either ascending or descending, using a key value. A lot of sorting algorithms have been developed so far. This research paper presents different sorting algorithms of data structures - Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, Heap Sort and Quick Sort - and gives their performance analysis with respect to time complexity. These six algorithms are important and have been an area of focus for a long time, but the question remains the same: "which to use when?", which is the main reason to perform this research. Each algorithm solves the sorting problem in a different way. This research provides a detailed study of how all six algorithms work and then compares them on the basis of various parameters apart from time c...
Sorting is an essential operation which is widely used and is fundamental to some very basic day-to-day utilities like searches, databases, social networks and much more. Optimizing this basic operation in terms of complexity as well as efficiency is cardinal. Optimization is achieved with respect to the space and time complexities of the algorithm. In this paper, a novel left-field N-dimensional Cartesian-spaced sorting method is proposed by combining the best characteristics of bucket sort, counting sort and radix sort, in addition to employing hashing and dynamic programming for making the method more efficient. A comparison between the proposed sorting method and various existing sorting methods like bubble sort, insertion sort, selection sort, merge sort, heap sort, counting sort, bucket sort, etc., has also been performed. The time complexity of the proposed model is estimated to be linear, i.e. O(n), for the best, average and worst cases, which is better than every sorting algorithm introduced to date.
2020
The problem of sorting arises frequently in computer programming and thus needs to be addressed. Many different sorting algorithms have been developed and improved to make sorting optimized and fast. As a measure of performance, mainly the average number of operations or the average execution times of these algorithms have been compared. There is no one sorting method that is best for every situation. Some of the factors to be considered in choosing a sorting algorithm include the size of the list to be sorted, the programming effort, the number of words of main memory available, the size of disk or tape units, the extent to which the list is already ordered, and the distribution of values.
An algorithm is a precise specification of a sequence of instructions to be carried out in order to solve a given problem. Sorting is considered a fundamental operation in computer science as it is used as an intermediate step in many operations. Sorting refers to the process of arranging a list of elements in a particular order; the elements are arranged in increasing or decreasing order of their key values. This research paper presents different sorting algorithms of data structures - Bubble Sort, Selection Sort, Insertion Sort, Merge Sort and Quick Sort - and gives their performance analysis with respect to time complexity. These five algorithms are important and have been an area of focus for a long time, but the question remains the same: "which to use when?", which is the main reason to perform this research. Each algorithm solves the sorting problem in a different way. This research provides a detailed study of how all five algorithms work and then compares them on the basis of various parameters apart from time complexity to reach our conclusion.

I. INTRODUCTION. An algorithm is an unambiguous, step-by-step procedure for solving a problem, which is guaranteed to terminate after a finite number of steps. In other words, an algorithm is a logical representation of the instructions which should be executed to perform a meaningful task. For a given problem, there are generally many different algorithms for solving it. Some algorithms are more efficient than others, in that less time or memory is required to execute them. The analysis of algorithms studies the time and memory requirements of algorithms and the way those requirements depend on the number of items being processed. Sorting is generally understood to be the process of rearranging a given set of objects in a specific order, and therefore the analysis and design of useful sorting algorithms has remained one of the most important research areas in the field. Despite the fact that several new sorting algorithms have been introduced, a large number of programmers depend on one of the comparison-based sorting algorithms: Bubble, Insertion, Selection sort, etc. Hence sorting is almost universally performed and is considered a fundamental activity. The usefulness and significance of sorting is shown by its day-to-day application to real-life objects; for instance, objects are sorted in telephone directories, income tax files, tables of contents, libraries and dictionaries.

The methods of sorting can be divided into two categories. Internal sorting: if all the data to be sorted can be held in main memory at one time, internal sorting methods are used. External sorting: when the data to be sorted cannot be accommodated in memory all at once and some of it has to be kept in auxiliary memory (hard disk, floppy, tape, etc.), external sorting methods are used.

The complexity of a sorting algorithm measures the running time as a function of the number n of items sorted. The choice of which sorting method is suitable for a problem depends on various efficiency considerations; the most important of these are: the length of time spent by the programmer in coding a particular sorting program, the amount of machine time necessary for running the program, the amount of memory necessary for running the program, and stability - does the sort preserve the order of keys with equal values?
2008
In-place sorting algorithms play an important role in many fields such as very large database systems, data warehouses, data mining, etc. Such algorithms maximize the size of data that can be processed in main memory without input/output operations. In this paper, a novel in-place sorting algorithm is presented. The algorithm comprises two phases: the first rearranges the input unsorted array in place, resulting in segments that are ordered relative to each other but whose elements are yet to be sorted. The first phase requires linear time, while in the second phase the elements of each segment are sorted in place in O(z log z) time, where z is the size of the segment, using O(1) auxiliary storage. In the worst case, for an array of size n, the algorithm performs O(n log z) element comparisons and O(n log z) element moves. Further, no auxiliary arithmetic operations with indices are required. Besides these theoretical achievements, the algorithm is of practical interest because of...
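The abstract does not specify how the first phase rearranges the array; one way (not necessarily the paper's) to realize the same two-phase pattern for an int array in Java is a linear in-place split around a pivot, which yields two segments ordered relative to each other, followed by an independent in-place sort of each segment. The method name and the pivot parameter are assumptions of this sketch.

    import java.util.Arrays;

    // Hypothetical illustration of the two-phase pattern. Phase 1: one linear,
    // in-place pass moves everything smaller than the pivot before everything
    // else, so the two segments are ordered relative to each other. Phase 2:
    // each segment is then sorted in place on its own.
    static void twoPhaseSort(int[] a, int pivot) {
        int i = 0, j = a.length - 1;
        while (i <= j) {                          // phase 1: linear split
            if (a[i] < pivot) {
                i++;
            } else {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                j--;
            }
        }
        Arrays.sort(a, 0, i);                     // phase 2: sort each segment
        Arrays.sort(a, i, a.length);              // (Arrays.sort on int[] is in place)
    }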