Sorting algorithm


History and Classification of Sorting Algorithms
– The sorting problem has attracted research since the beginning of computing
– Betty Holberton worked on early sorting algorithms in the 1950s
– Bubble sort was analyzed in 1956
– Asymptotically optimal algorithms have been known since the mid-20th century
– Timsort, widely used, was introduced in 2002
– Sorting algorithms can be classified by computational complexity
– They can have best, worst, and average case behavior
– Some algorithms are in-place, using only O(1) or O(log n) memory beyond the items being sorted
– Algorithms can be recursive, non-recursive, or both
– Stability refers to maintaining the relative order of records with equal keys (see the sketch after this list)
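
Stability is easiest to see with records that carry secondary data. Below is a minimal sketch in Python (the student records are made up for illustration); Python's built-in sorted() is stable, so records with equal keys keep their original relative order:

    # Hypothetical records: (name, grade); alice and carol share grade 2,
    # bob and dave share grade 1.
    records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

    # sorted() in Python is stable: ties keep their input order,
    # so alice stays before carol and bob stays before dave.
    by_grade = sorted(records, key=lambda r: r[1])
    print(by_grade)
    # [('bob', 1), ('dave', 1), ('alice', 2), ('carol', 2)]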

Comparison-based Sorting Algorithms
– Quicksort: O(n log n) average and best case, O(n^2) worst case, partitioning (see the sketch after this list)
– Merge sort: O(n log n) worst case, merging
– Heapsort: O(n log n) worst case, selection
– Timsort: O(n log n) worst case, insertion and merging
– Library sort: O(n log n) average case with high probability, O(n^2) worst case, gapped insertion
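
To make the partitioning idea behind quicksort concrete, here is a minimal sketch (an out-of-place formulation chosen for clarity; practical implementations partition in place and pick pivots more carefully):

    def quicksort(items):
        # Average case O(n log n): each level splits the input around a pivot.
        # The O(n^2) worst case occurs when the splits are very uneven.
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]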

Non-comparison Sorting Algorithms
– Bucket sort (uniform keys): O(n + k) best and average case, O(n^2 * k) worst case, assumes uniform distribution (see the sketch after this list)
– MSD Radix Sort: O(n * (k/d)) average and worst case, O(n + 2^d) extra memory, stable version uses an external array of size n
– Spreadsort: O(n * (k/d)) best case, O(n * ((k/s) + d)) average and worst case, asymptotics assume n << 2^k though the algorithm does not require it
– Burstsort: O(n * (k/d)) average and worst case, better constant factor than radix sort for sorting strings
– Flashsort: O(n) best case, O(n + r) average case, O(n^2) worst case, requires uniform distribution, in-place version not stable
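
To make the distribution idea concrete, here is a minimal bucket sort sketch; it assumes keys are floats uniformly distributed in [0, 1), and the choice of n buckets and of sorted() for the per-bucket sort are illustrative rather than required:

    def bucket_sort(values):
        # Assumes values lie in [0, 1) and are roughly uniformly distributed.
        # Average O(n + k); performance degrades if many keys share a bucket.
        n = len(values)
        if n == 0:
            return []
        buckets = [[] for _ in range(n)]
        for v in values:
            buckets[int(v * n)].append(v)  # maps [0, 1) onto bucket indices 0..n-1
        result = []
        for b in buckets:
            result.extend(sorted(b))       # buckets are small, any simple sort will do
        return result

    print(bucket_sort([0.42, 0.07, 0.93, 0.31, 0.58]))
    # [0.07, 0.31, 0.42, 0.58, 0.93]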

Analysis of Sorting Algorithms
– Sorting networks vary in time complexity and stability depending on their construction.
– The bitonic sorter runs in O(log^2 n) parallel time and is well suited to parallel hardware.
– Bogosort has an unbounded worst-case time complexity and works by random shuffling (see the sketch after this list).
– Stooge sort has a time complexity of O(n^(log 3 / log 1.5)), roughly O(n^2.71), and can be made stable.
– Slowsort has a time complexity of n^(log n) and is a "multiply and surrender" algorithm.
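
As a toy illustration of bogosort's generate-and-test approach (useful only for analysis, never in practice), a minimal sketch:

    import random

    def is_sorted(items):
        return all(items[i] <= items[i + 1] for i in range(len(items) - 1))

    def bogosort(items):
        # Shuffle repeatedly until the list happens to come out sorted.
        # Expected O(n * n!) work; the worst case is unbounded because
        # nothing guarantees a lucky shuffle ever occurs.
        items = list(items)
        while not is_sorted(items):
            random.shuffle(items)
        return items

    print(bogosort([3, 1, 2]))  # [1, 2, 3], after a random number of shuffles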

Popular Sorting Algorithms
– Insertion sort is widely used for small data sets.
– Heapsort, merge sort, and quicksort are commonly used for large data sets.
– Hybrid algorithms combine an efficient algorithm with insertion sort for small lists (see the sketch after this list).
– Timsort and introsort are more sophisticated variants used in specific implementations.
– Distribution sorts like counting sort and radix sort are used for restricted data.
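
A minimal sketch of the hybrid idea: merge sort handles large inputs, while lists at or below a cutoff fall back to insertion sort (the cutoff of 16 is an illustrative value; real libraries tune it and use more refined merging or partitioning):

    CUTOFF = 16  # illustrative threshold, not a tuned value

    def insertion_sort(items):
        # Fast for short lists: simple inner loop, low constant factor.
        for i in range(1, len(items)):
            key = items[i]
            j = i - 1
            while j >= 0 and items[j] > key:
                items[j + 1] = items[j]
                j -= 1
            items[j + 1] = key
        return items

    def hybrid_merge_sort(items):
        # Below the cutoff, insertion sort wins; above it, divide and merge.
        if len(items) <= CUTOFF:
            return insertion_sort(list(items))
        mid = len(items) // 2
        left = hybrid_merge_sort(items[:mid])
        right = hybrid_merge_sort(items[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(hybrid_merge_sort(list(range(31, -1, -1))))  # 0, 1, ..., 31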

Sorting algorithm (Wikipedia)

In computer science, a sorting algorithm is an algorithm that puts elements of a list into an order. The most frequently used orders are numerical order and lexicographical order, and either ascending or descending. Efficient sorting is important for optimizing the efficiency of other algorithms (such as search and merge algorithms) that require input data to be in sorted lists. Sorting is also often useful for canonicalizing data and for producing human-readable output.
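
A small example of why sorted input pays off for other algorithms: once a list is sorted, Python's bisect module can answer membership queries by binary search in O(log n) per lookup instead of an O(n) scan:

    import bisect

    data = [37, 4, 98, 15, 61, 23]
    data.sort()  # one-time O(n log n) cost

    # Binary search on the sorted list: O(log n) per query.
    i = bisect.bisect_left(data, 23)
    print(i < len(data) and data[i] == 23)  # True: 23 is present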

[Image: Merge sort]

Formally, the output of any sorting algorithm must satisfy two conditions (a small check illustrating both follows the list):

  1. The output is in monotonic order (each element is no smaller/larger than the previous element, according to the required order).
  2. The output is a permutation (a reordering, yet retaining all of the original elements) of the input.
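
The two conditions translate directly into a check that can be run against any candidate output; here is a minimal sketch (the helper name is made up for illustration):

    from collections import Counter

    def is_valid_sort(output, original):
        # Condition 1: monotonic (non-decreasing) order.
        monotonic = all(output[i] <= output[i + 1] for i in range(len(output) - 1))
        # Condition 2: the output is a permutation of the input,
        # i.e. the same elements with the same multiplicities.
        permutation = Counter(output) == Counter(original)
        return monotonic and permutation

    print(is_valid_sort([1, 2, 2, 3], [3, 2, 1, 2]))  # True
    print(is_valid_sort([1, 3], [3, 2, 1, 2]))        # False: elements were lost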

For optimum efficiency, the input data should be stored in a data structure which allows random access rather than one that allows only sequential access.

