Quick Sort

Quick Sort is a sorting algorithm that uses a divide-and-conquer approach to sort an array of elements. It works by selecting a pivot element and partitioning the array into two sub-arrays: one with elements smaller than the pivot and one with elements larger than the pivot. The process is then repeated recursively on each sub-array until the entire array is sorted, as sketched below.
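
A minimal Python sketch of this scheme, using list comprehensions for clarity (the pivot choice and names are illustrative; the excerpts below discuss many practical variations):

    def quick_sort(items):
        # Divide and conquer: partition around a pivot, then sort each side.
        if len(items) <= 1:
            return items                    # base case: nothing to sort
        pivot = items[len(items) // 2]      # one common pivot choice
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]   # keys equal to the pivot stay with it
        larger = [x for x in items if x > pivot]
        return quick_sort(smaller) + equal + quick_sort(larger)

    print(quick_sort([5, 3, 2, 6, 4, 1, 3, 7]))    # [1, 2, 3, 3, 4, 5, 6, 7]

Note that this version builds new lists for simplicity; the in-place, array-partitioning formulations discussed in the excerpts below avoid that extra memory.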

Written by Perlego with AI assistance

Key excerpts on "Quick Sort"

  • Algorithms and Theory of Computation Handbook, Volume 1
    eBook - PDF
    It is a very good choice for an internal sorting algorithm. Sorting by selection with an array having a heap property is also used for external sorting.
    3.3.1.4 Quicksort. For many applications a more realistic measure of the time complexity of an algorithm is its expected time. In sorting, a classical example is quicksort [19,29], which has an optimal expected time complexity of O(n log n) under the decision tree model, while there are sequences that force it to perform Ω(n²) operations (in other words, its worst-case time complexity is quadratic). If the worst-case sequences are very rare, or the algorithm exhibits small variance around its expected case, then this type of algorithm is suitable in practice. Several factors have made quicksort a very popular choice for implementing a sorting routine. Quicksort is a simple divide-and-conquer concept; the partitioning can be done in a very short loop and is conceptually simple; its memory requirements can be guaranteed to be only logarithmic in the size of the input; the pivot can be placed in a register; and, most importantly, its expected number of comparisons is almost half that of its worst-case-optimal competitors, most notably heapsort. Many introductory courses on algorithms favor quicksort in their presentations. However, it is very easy to implement quicksort in a way that seems correct and extremely efficient for many sorting situations, yet hides O(n²) behavior for a simple case (for example, sorting n equal keys; a three-way partition that avoids this is sketched below). Users of such library routines will be satisfied initially, only to find out later that, on what seems a simple sorting task, the implementation is taking far too long to finish. Fine-tuning quicksort is a delicate issue [5]. Many of the proposed improvements may be offset by reduced applicability of the method or by more fragile and less clear code.
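    The equal-keys pitfall mentioned above is commonly avoided with a three-way ("fat pivot") partition, which groups keys equal to the pivot and never recurses on them. A minimal Python sketch, not taken from the excerpted handbook (pivot choice and names are illustrative):

        def quicksort3(a, lo=0, hi=None):
            # Three-way quicksort: keys equal to the pivot are excluded from
            # recursion, so n equal keys cost O(n) overall instead of O(n^2).
            if hi is None:
                hi = len(a) - 1
            if lo >= hi:
                return
            pivot = a[(lo + hi) // 2]
            # invariant: a[lo:lt] < pivot, a[lt:i] == pivot, a[gt+1:hi+1] > pivot
            lt, i, gt = lo, lo, hi
            while i <= gt:
                if a[i] < pivot:
                    a[lt], a[i] = a[i], a[lt]
                    lt += 1
                    i += 1
                elif a[i] > pivot:
                    a[i], a[gt] = a[gt], a[i]
                    gt -= 1
                else:
                    i += 1
            quicksort3(a, lo, lt - 1)    # strictly smaller keys
            quicksort3(a, gt + 1, hi)    # strictly larger keys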
  • Programming with C++
    • Kyla McMullen, Elizabeth Matthews, June Jamrich Parsons (Authors)
    • 2021 (Publication Date)
    If your data set is already mostly sorted, bubble sort has a best-case runtime of O(n) because it can stop early. Otherwise, use a different sort.
    24.3 QUICKSORT. Defining the Quicksort Algorithm (24.3.1, 24.3.2, 24.3.4). Quicksort relies on probabilities to ensure that it generally performs efficiently, despite having an undesirable worst-case scenario. Quicksort uses a divide-and-conquer approach. Like bubble sort, quicksort has two parts: a partitioning step and a recursive step. In effect, the partitioning step picks a random item as a pivot, then figures out where that one item belongs in the sorted order. Imagine that to sort your unordered contacts pages, you pick the page for your friend Megan Lee. You put Megan's page aside and create two piles: one for contact pages that should come before Megan's page, and another for contact pages that should come after it. Once you have two piles, you know that Megan's page goes between them, as in Figure 24-9. A quicksort algorithm selects the middle item as the pivot, and then moves the pivot out of the way, as in Figure 24-10 (a partition sketch in this style follows the excerpt).
    [Figure 24-10: Quicksort pivot selection and movement. Items are unsorted; the middle item is selected as pivot; the pivot moves to the end of the items.]
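    A minimal Python sketch of the partitioning style described above: the middle item is chosen as the pivot, moved to the end and out of the way, and the remaining items are swapped into "before" and "after" piles (names and details are illustrative assumptions, not the book's code):

        def partition(values, start, end):
            # Partition values[start..end]; return the pivot's final index.
            mid = (start + end) // 2
            values[mid], values[end] = values[end], values[mid]   # move pivot out of the way
            pivot = values[end]
            boundary = start           # items left of boundary belong "before" the pivot
            for i in range(start, end):
                if values[i] < pivot:
                    values[i], values[boundary] = values[boundary], values[i]
                    boundary += 1
            values[boundary], values[end] = values[end], values[boundary]   # pivot lands between the piles
            return boundary

        def quicksort(values, start, end):
            if start < end:
                p = partition(values, start, end)
                quicksort(values, start, p - 1)    # the "before" pile
                quicksort(values, p + 1, end)      # the "after" pile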
  • Data Structures and Algorithms in Java
    • Michael T. Goodrich, Roberto Tamassia, Michael H. Goldwasser (Authors)
    • 2014 (Publication Date)
    • Wiley (Publisher)
    For larger data sets, the median of more than three potential pivots might be computed.
    Hybrid Approaches. Although quick-sort has very good performance on large data sets, it has rather high overhead on relatively small data sets. For example, the process of quick-sorting a sequence of eight elements, as illustrated in Figures 13.10 through 13.12, involves considerable bookkeeping. In practice, a simple algorithm like insertion-sort (Section 3.1.2) will execute faster when sorting such a short sequence. It is therefore common, in optimized sorting implementations, to use a hybrid approach, with a divide-and-conquer algorithm used until the size of a subsequence falls below some threshold (perhaps 50 elements); insertion-sort can be directly invoked on portions with length below the threshold (a hybrid sketch follows the excerpt). We will further discuss such practical considerations in Section 13.4, when comparing the performance of various sorting algorithms.
    13.3 Studying Sorting through an Algorithmic Lens. Recapping our discussions on sorting to this point, we have described several methods with either a worst-case or expected running time of O(n log n) on an input sequence of size n. These methods include merge-sort and quick-sort, described in this chapter, as well as heap-sort (Section 9.4.2). In this section, we will study sorting as an algorithmic problem, addressing general issues about sorting algorithms.
    13.3.1 Lower Bound for Sorting. A natural first question to ask is whether we can sort any faster than O(n log n) time. Interestingly, if the computational primitive used by a sorting algorithm is the comparison of two elements, this is in fact the best we can do: comparison-based sorting has an Ω(n log n) worst-case lower bound on its running time. (Recall the notation Ω(·) from Section 4.3.1.) To focus on the main cost of comparison-based sorting, let us count only comparisons, for the sake of a lower bound.
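    A minimal Python sketch of the hybrid approach just described, with an illustrative threshold of 50 elements (the cutoff value and names are assumptions for illustration, not the book's code):

        CUTOFF = 50    # threshold below which insertion sort takes over; tuned empirically in practice

        def insertion_sort(a, lo, hi):
            # Sort a[lo..hi] in place; fast on short or nearly sorted ranges.
            for i in range(lo + 1, hi + 1):
                key, j = a[i], i - 1
                while j >= lo and a[j] > key:
                    a[j + 1] = a[j]
                    j -= 1
                a[j + 1] = key

        def hybrid_quicksort(a, lo=0, hi=None):
            if hi is None:
                hi = len(a) - 1
            if hi - lo + 1 <= CUTOFF:
                insertion_sort(a, lo, hi)          # small subsequence: switch algorithms
                return
            mid = (lo + hi) // 2
            a[mid], a[hi] = a[hi], a[mid]          # middle element as pivot, moved to the end
            pivot, b = a[hi], lo
            for i in range(lo, hi):
                if a[i] < pivot:
                    a[i], a[b] = a[b], a[i]
                    b += 1
            a[b], a[hi] = a[hi], a[b]
            hybrid_quicksort(a, lo, b - 1)
            hybrid_quicksort(a, b + 1, hi)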
  • Data Structures and Algorithms in C++
    • Michael T. Goodrich, Roberto Tamassia, David M. Mount (Authors)
    • 2011 (Publication Date)
    • Wiley (Publisher)
    They are initialized to a and b − 1, respectively. During each round, elements that are on the wrong side of the pivot are swapped with each other, until these markers bump into each other. Much of the efficiency of quick-sort depends on how the pivot is chosen. As we have seen, quick-sort is most efficient if the pivot is near the middle of the subvector being sorted. Our choice of setting the pivot to the last element of the subvector relies on the assumption that the last element is reflective of the median key value. A better choice, if the subvector is moderately sized, is to select the pivot as the median of three values, taken respectively from the front, middle, and tail of the array. This is referred to as the median-of-three heuristic (sketched after this excerpt). It tends to perform well in practice, and is faster than selecting a random pivot through the use of a random-number generator.
    11.3 Studying Sorting through an Algorithmic Lens. Recapping our discussions on sorting to this point, we have described several methods with either a worst-case or expected running time of O(n log n) on an input sequence of size n. These methods include merge-sort and quick-sort, described in this chapter, as well as heap-sort (Section 8.3.5). In this section, we study sorting as an algorithmic problem, addressing general issues about sorting algorithms.
    11.3.1 A Lower Bound for Sorting. A natural first question to ask is whether we can sort any faster than O(n log n) time. Interestingly, if the computational primitive used by a sorting algorithm is the comparison of two elements, then this is, in fact, the best we can do: comparison-based sorting has an Ω(n log n) worst-case lower bound on its running time. (Recall the notation Ω(·) from Section 4.2.3.) To focus on the main cost of comparison-based sorting, let us count only comparisons, for the sake of a lower bound.
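    A minimal Python sketch of the median-of-three heuristic just described (illustrative code, not the book's C++ listing): it returns the index of the median of the front, middle, and tail keys, which would then be swapped into the pivot position before partitioning.

        def median_of_three(a, lo, hi):
            # Return the index of the median of a[lo], a[mid], a[hi].
            mid = (lo + hi) // 2
            x, y, z = a[lo], a[mid], a[hi]
            if x <= y:
                if y <= z:
                    return mid                 # x <= y <= z
                return hi if x <= z else lo    # x <= z < y, or z < x <= y
            if x <= z:
                return lo                      # y < x <= z
            return hi if y <= z else mid       # y <= z < x, or z < y < x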
  • Data Structures and Algorithms in Python
    • Michael T. Goodrich, Roberto Tamassia, Michael H. Goldwasser (Authors)
    • 2013 (Publication Date)
    • Wiley (Publisher)
    In practice, another common technique for choosing a pivot is to use the median of three values, taken respectively from the front, middle, and tail of the array. This median-of-three heuristic will more often choose a good pivot, and computing a median of three may incur lower overhead than selecting a pivot with a random number generator. For larger data sets, the median of more than three potential pivots might be computed.
    Hybrid Approaches. Although quick-sort has very good performance on large data sets, it has rather high overhead on relatively small data sets. For example, the process of quick-sorting a sequence of eight elements, as illustrated in Figures 12.10 through 12.12, involves considerable bookkeeping. In practice, a simple algorithm like insertion-sort (Section 7.5) will execute faster when sorting such a short sequence. It is therefore common, in optimized sorting implementations, to use a hybrid approach, with a divide-and-conquer algorithm used until the size of a subsequence falls below some threshold (perhaps 50 elements); insertion-sort can be directly invoked on portions with length below the threshold. We will further discuss such practical considerations in Section 12.5, when comparing the performance of various sorting algorithms.
    12.4 Studying Sorting through an Algorithmic Lens. Recapping our discussions on sorting to this point, we have described several methods with either a worst-case or expected running time of O(n log n) on an input sequence of size n. These methods include merge-sort and quick-sort, described in this chapter, as well as heap-sort (Section 9.4.2). In this section, we study sorting as an algorithmic problem, addressing general issues about sorting algorithms.
    12.4.1 Lower Bound for Sorting. A natural first question to ask is whether we can sort any faster than O(n log n) time.
  • Python for Everyone
    • Cay S. Horstmann, Rance D. Necaise (Authors)
    • 2020 (Publication Date)
    • Wiley (Publisher)
    These important issues are often revisited in later computer science courses. EXAMPLE CODE: See sec05/mergetimer.py in your eText or companion code for a program that times the merge sort algorithm.
    Special Topic 12.3 The Quicksort Algorithm. Quicksort is a commonly used algorithm that has the advantage over merge sort that no temporary lists are required to sort and merge the partial results. The quicksort algorithm, like merge sort, is based on the strategy of divide and conquer. To sort a range values[start] . . . values[to] of the list values, first rearrange the elements in the range so that no element in the range values[start] . . . values[p] is larger than any element in the range values[p + 1] . . . values[to]. This step is called partitioning the range. For example, suppose we start with the range
        5 3 2 6 4 1 3 7
    Here is a partitioning of the range. Note that the partitions aren't yet sorted.
        3 3 2 1 4 6 5 7
    You'll see later how to obtain such a partition. In the next step, sort each partition by recursively applying the same algorithm to the two partitions. That sorts the entire range, because the largest element in the first partition is at most as large as the smallest element in the second partition.
        1 2 3 3 4 5 6 7
    [Margin note: Merge sort is an O(n log(n)) algorithm. The n log(n) function grows much more slowly than n². In quicksort, one partitions the elements into two groups, holding the smaller and larger elements. Then one sorts each group.]
    Quicksort is implemented recursively as follows:
        def quickSort(values, start, to) :
            if start >= to : return
            p = partition(values, start, to)
            quickSort(values, start, p)
            quickSort(values, p + 1, to)
    Let us return to the problem of partitioning a range. Pick an element from the range and call it the pivot. There are several variations of the quicksort algorithm; one is sketched below.
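    A Hoare-style partition is one common variation, and it is the kind the recursive code above expects, since both recursive calls include index p. A minimal Python sketch, consistent with the excerpt but not the book's own listing (the pivot choice is an assumption):

        def partition(values, start, to):
            # Rearrange values[start..to] and return p such that no element in
            # values[start..p] is larger than any element in values[p+1..to].
            pivot = values[start]
            i, j = start - 1, to + 1
            while True:
                i += 1
                while values[i] < pivot:
                    i += 1
                j -= 1
                while values[j] > pivot:
                    j -= 1
                if i >= j:
                    return j               # j is the split point p
                values[i], values[j] = values[j], values[i]

    For example, quickSort([5, 3, 2, 6, 4, 1, 3, 7], 0, 7) with this partition first produces the split 3 3 2 1 4 | 6 5 7 shown above, then sorts the list in place to [1, 2, 3, 3, 4, 5, 6, 7].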
  • A Practical Approach to Data Structure and Algorithm with Programming in C
    • Akhilesh Kumar Srivastava (Author)
    • 2019 (Publication Date)
    • Arcler Press (Publisher)
        swap(&A[j], &A[j+1]);
        }
        /**************************************/
        void Traverse(int A[], int n) {
            for (int i = 1; i <= n; i++) {
                printf("%d\t", A[i - 1]);
            }
        }
        /**************************************/
        int main(int argc, const char *argv[]) {
            int A[] = {5, 2, 3, 4, 9, 8, 7, 6, 1, 2};
            int B[] = {5, 2, 3, 4, 9, 8, 7, 6, 1, 2};
            int C[] = {5, 2, 3, 4, 9, 8, 7, 6, 1, 2};
            BubbleSort(A, 10);
            printf("\nbubble sort\n");
            Traverse(A, 10);
            InsertionSort(B, 10);
            printf("\nInsertion Sort\n");
            Traverse(B, 10);
            SelectionSort(C, 10);
            printf("\nSelection Sort\n");
            Traverse(C, 10);
            return 0;
        }
    6.4. Quick Sort. Quicksort is a divide-and-conquer algorithm which relies on a partition operation. To partition an array, an element called a pivot is selected. All elements smaller than the pivot are moved before it, and all greater elements are moved after it. The left and right sublists are then recursively sorted using the same algorithm.
    • Partition: it is assumed that the last element is greater than all other elements in the array. In the first run of the Partition algorithm, the last element is specifically taken to be larger than all other elements; in subsequent runs, the last element automatically becomes larger than the set considered.
    • Usually, the first element is picked as the pivot. All other elements are compared with this element.
    • The purpose of Partition is to find the appropriate position of the pivot in the sorted array.
    • For this purpose, elements smaller than the pivot are kept on the left part of the array and larger ones on the right side.
    • Once the appropriate position of the pivot in the sorted array is found, it is returned, and Quick Sort is called on both the left and right halves (a sketch of this scheme follows the excerpt).
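    A minimal Python sketch of the partition scheme described in this excerpt, with the first element as pivot (names are illustrative; the book's own C listing is not reproduced here):

        def partition_first_pivot(a, lo, hi):
            # Find the pivot's proper position: smaller elements go to its left,
            # larger ones to its right; return the pivot's final index.
            pivot = a[lo]
            pos = lo
            for i in range(lo + 1, hi + 1):
                if a[i] < pivot:
                    pos += 1
                    a[pos], a[i] = a[i], a[pos]
            a[lo], a[pos] = a[pos], a[lo]          # drop the pivot into place
            return pos

        def quick_sort(a, lo, hi):
            if lo < hi:
                p = partition_first_pivot(a, lo, hi)
                quick_sort(a, lo, p - 1)           # left half
                quick_sort(a, p + 1, hi)           # right half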
  • Objects, Abstraction, Data Structures and Design
    • Elliot B. Koffman, Paul A. T. Wolfgang (Authors)
    • 2012 (Publication Date)
    • Wiley (Publisher)
    Analysis of Quicksort. If the pivot value is a random value selected from the current sequence, then statistically it is expected that half of the items in the sequence will be less than the pivot value and half will be greater. After partitioning, if the left and right sequences have the same number of elements (the best case), there will be log n levels of recursion. At each level, the partitioning process involves moving every element into its correct partition, so quicksort is O(n log n), just like merge sort. But what if the split is not 50-50? Let us consider the case where each split is 90-10. Instead of a 100-element array being split into two 50-element arrays, there will be one array with 90 elements and one with just 10. The 90-element array may be split 50-50, or it may also be split 90-10. In the latter case, there would be one array with 81 elements and one with just 9. Generally, for random input, the splits will not be exactly 50-50, but neither will they all be 90-10. An exact analysis is difficult and beyond the scope of this book, but the running time will be bounded by a constant times n log n. There is one situation, however, where quicksort gives very poor behavior. If, each time we partition the array, we end up with a subarray that is empty, the other subarray will have one less element than the one just split (only the pivot value will be removed). Therefore, we will have n levels of recursive calls (instead of log n), and the algorithm will be O(n²); a short demonstration follows the excerpt. Because of the overhead of recursive function calls (versus iteration), quicksort will take longer and require more extra storage on the runtime stack than any of the earlier quadratic algorithms in this particular case. We will discuss a way to handle this situation later.
    Code for Quicksort. Listing 10.9 shows file QuickSort.h with function quick_sort.
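    A small Python experiment (illustrative, not from the book) makes the degenerate case concrete: with the first element as pivot, an already sorted input leaves one partition empty at every step, so the recursion depth grows to about n rather than about log n.

        import random

        def quicksort_depth(a, lo, hi, depth=0):
            # First-element-pivot quicksort; returns the deepest recursion level reached.
            if lo >= hi:
                return depth
            pivot, pos = a[lo], lo
            for i in range(lo + 1, hi + 1):
                if a[i] < pivot:
                    pos += 1
                    a[pos], a[i] = a[i], a[pos]
            a[lo], a[pos] = a[pos], a[lo]
            return max(quicksort_depth(a, lo, pos - 1, depth + 1),
                       quicksort_depth(a, pos + 1, hi, depth + 1))

        n = 300
        print(quicksort_depth(list(range(n)), 0, n - 1))              # sorted input: about n levels
        print(quicksort_depth(random.sample(range(n), n), 0, n - 1))  # random input: typically O(log n) levels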
  • Sorting: A Distribution Theory
    eBook - PDF
    8 Sample Sort. The term sample sort refers to a class of algorithms based on Quick Sort in which a sampling stage precedes the application of Quick Sort. The pathological cases of Quick Sort, with Ω(n²) behavior, arise when the splitting pivot fails to split the given list near its middle at most recursive levels, as, for example, with an already sorted list of numbers. The idea in sample sort algorithms is to avoid such pathological splitting by taking a sample from the list and using it to produce a situation favorable to middle splitting, with probability higher than the chance that a single pivot splits near the middle. This preprocessing stage helps Quick Sort reach its speed potential. Some sample sort variations take a small sample. The median of the small sample is employed as the splitting pivot to speed up Quick Sort; the sampling is then reapplied at each level of recursion. Another approach uses a large sample once, at the top level of recursion. The elements of the sample are then inserted in their proper places, producing a large number of random segments, each handled by standard Quick Sort. In either case, Quick Sort's behavior is improved, but the algorithm becomes more complex.
    8.1 THE SMALL SAMPLE ALGORITHM. In the small sample approach to sorting n → ∞ elements, a fixed-size sample is chosen. The median of the sample is then used as a pivot; recursively, at each level the sampling is reapplied. Median selection is not always the easiest task. We saw instances of median-finding algorithms based on adaptations of some standard sorting algorithms. The point of a small sample approach is to avoid complicating sample sort by burdening the median-finding stage. If the sample size is 2k + 1 for some small fixed k, simple median-finding algorithms may be used (a sketch follows the excerpt).
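    A minimal Python sketch of the small-sample idea: at each level, a sample of 2k + 1 elements is drawn and its median used as the pivot (the sample size and names are illustrative assumptions, not the book's notation):

        import random

        K = 1    # sample size 2*K + 1 = 3; a small fixed k keeps median finding trivial

        def small_sample_quicksort(a, lo=0, hi=None):
            if hi is None:
                hi = len(a) - 1
            if lo >= hi:
                return
            if hi - lo + 1 >= 2 * K + 1:
                idx = random.sample(range(lo, hi + 1), 2 * K + 1)
                idx.sort(key=lambda i: a[i])
                p = idx[K]                         # index of the sample's median
            else:
                p = lo                             # range too small to sample
            a[p], a[hi] = a[hi], a[p]              # move the sampled pivot to the end
            pivot, b = a[hi], lo
            for i in range(lo, hi):
                if a[i] < pivot:
                    a[i], a[b] = a[b], a[i]
                    b += 1
            a[b], a[hi] = a[hi], a[b]
            small_sample_quicksort(a, lo, b - 1)
            small_sample_quicksort(a, b + 1, hi)

    The median of a sample splits near the middle more often than a single random pivot, which is exactly the favorable situation the excerpt describes.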
Index pages curate the most relevant extracts from our library of academic textbooks. They are created using an in-house natural language model (NLM), with each excerpt adding context and meaning to a key research topic.