Computer Science
Quicksort Python
Quicksort is a sorting algorithm that uses a divide-and-conquer approach to sort an array or list. In Python, the algorithm works by selecting a pivot element and partitioning the list into two sub-lists, one with elements smaller than the pivot and the other with elements larger than the pivot. The process is repeated recursively until the entire list is sorted.
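As a quick orientation before the excerpts below, here is a minimal Python sketch of that description; it uses list comprehensions and extra lists for clarity and is not taken from any of the excerpted books:

def quicksort(items):
    # Minimal illustrative sketch: take the first element as the pivot,
    # split the remaining elements into smaller and larger-or-equal groups,
    # and sort each group recursively.
    if len(items) <= 1:
        return items
    pivot = items[0]
    smaller = [x for x in items[1:] if x < pivot]
    larger = [x for x in items[1:] if x >= pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([5, 3, 2, 6, 4, 1, 3, 7]))  # prints [1, 2, 3, 3, 4, 5, 6, 7]

The excerpts below mostly sort in place on a single list rather than building new lists, which is how quicksort is usually implemented in practice.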
6 Key excerpts on "Quicksort Python"
- Cay S. Horstmann, Rance D. Necaise (Authors)
- 2020 (Publication Date)
- Wiley (Publisher)
These important issues are often revisited in later computer science courses. EXAMPLE CODE See sec05/mergetimer.py in your eText or companion code for a program that times the merge sort algorithm. Special Topic 12.3 The Quicksort Algorithm. Quicksort is a commonly used algorithm that has the advantage over merge sort that no temporary lists are required to sort and merge the partial results. The quicksort algorithm, like merge sort, is based on the strategy of divide and conquer. To sort a range values[start] . . . values[to] of the list values, first rearrange the elements in the range so that no element in the range values[start] . . . values[p] is larger than any element in the range values[p + 1] . . . values[to]. This step is called partitioning the range. For example, suppose we start with a range

5 3 2 6 4 1 3 7

Here is a partitioning of the range. Note that the partitions aren't yet sorted.

3 3 2 1 4 6 5 7

You'll see later how to obtain such a partition. In the next step, sort each partition, by recursively applying the same algorithm on the two partitions. That sorts the entire range, because the largest element in the first partition is at most as large as the smallest element in the second partition. (Merge sort is an O(n log(n)) algorithm; the n log(n) function grows much more slowly than n². In quicksort, one partitions the elements into two groups, holding the smaller and larger elements, and then sorts each group.)

1 2 3 3 4 5 6 7

Quicksort is implemented recursively as follows:

def quickSort(values, start, to) :
    if start >= to : return
    p = partition(values, start, to)
    quickSort(values, start, p)
    quickSort(values, p + 1, to)

Let us return to the problem of partitioning a range. Pick an element from the range and call it the pivot. There are several variations of the quicksort algorithm.
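The excerpt stops before showing the partition step. A Hoare-style partition that is compatible with the recursion above (the calls quickSort(values, start, p) and quickSort(values, p + 1, to)) can be sketched as follows; this is an illustrative version that assumes the first element of the range is used as the pivot, not necessarily the book's exact listing:

def partition(values, start, to):
    # Hoare-style partitioning sketch: take values[start] as the pivot and
    # move two indices toward each other, swapping out-of-place elements.
    # Returns an index p such that no element in values[start..p] is larger
    # than any element in values[p + 1..to].
    pivot = values[start]
    i = start - 1
    j = to + 1
    while True:
        i += 1
        while values[i] < pivot:
            i += 1
        j -= 1
        while values[j] > pivot:
            j -= 1
        if i >= j:
            return j
        values[i], values[j] = values[j], values[i]

Applied to the range 5 3 2 6 4 1 3 7 above, this produces the partition 3 3 2 1 4 | 6 5 7 shown in the excerpt.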
- Michael T. Goodrich, Roberto Tamassia, Michael H. Goldwasser (Authors)
- 2013 (Publication Date)
- Wiley (Publisher)
With that said, it remains important to have a deep understanding of sorting algorithms. Most immediately, when calling the built-in function, it is good to know what to expect in terms of efficiency and how that may depend upon the initial order of elements or the type of objects that are being sorted. More generally, the ideas and approaches that have led to advances in the development of sorting algorithms carry over to algorithm development in many other areas of computing. We have introduced several sorting algorithms already in this book:

• Insertion-sort (see Sections 5.5.2, 7.5, and 9.4.1)
• Selection-sort (see Section 9.4.1)
• Bubble-sort (see Exercise C-7.38)
• Heap-sort (see Section 9.4.2)

In this chapter, we present four other sorting algorithms, called merge-sort, quick-sort, bucket-sort, and radix-sort, and then discuss the advantages and disadvantages of the various algorithms in Section 12.5. In Section 12.6.1, we will explore another technique used in Python for sorting data according to an order other than the natural order defined by the < operator.

The first two algorithms we describe in this chapter, merge-sort and quick-sort, use recursion in an algorithmic design pattern called divide-and-conquer. We have already seen the power of recursion in describing algorithms in an elegant manner (see Chapter 4). The divide-and-conquer pattern consists of the following three steps:

1. Divide: If the input size is smaller than a certain threshold (say, one or two elements), solve the problem directly using a straightforward method and return the solution so obtained. Otherwise, divide the input data into two or more disjoint subsets.
2. Conquer: Recursively solve the subproblems associated with the subsets.
3. Combine: Take the solutions to the subproblems and merge them into a solution to the original problem.
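To make the three-step pattern concrete, here is a minimal merge-sort sketch (not taken from the excerpted text) with the divide, conquer, and combine steps marked in comments:

def merge_sort(items):
    # Divide: a list of zero or one elements is already sorted.
    if len(items) <= 1:
        return items
    middle = len(items) // 2
    # Conquer: recursively sort the two halves.
    left = merge_sort(items[:middle])
    right = merge_sort(items[middle:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

Quicksort follows the same pattern, but does the main work in the divide (partition) step rather than in the combine step.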
Fundamentals of Python
First Programs
- Kenneth Lambert (Author)
- 2018 (Publication Date)
- Cengage Learning EMEA (Publisher)
Although no elements are exchanged, the total number of partitions is n − 1 and the total number of comparisons performed is ½n² − ½n, the same number as in selection sort and bubble sort. Thus, in the worst case, the quicksort algorithm is O(n²). If quicksort is implemented as a recursive algorithm, analysis must also consider memory usage for the call stack. Each recursive call requires a constant amount of memory for a stack frame, and there are two recursive calls after each partition. Thus, memory usage is O(log n) in the best case and O(n) in the worst case. Although the worst-case performance of quicksort is rare, programmers certainly prefer to avoid it. Choosing the pivot at the first or last position is not a wise strategy. Other methods of choosing the pivot, such as selecting a random position or choosing the median of the first, middle, and last elements, can help to approximate O(n log n) performance in the average case. Implementation of Quicksort. The quicksort algorithm is most easily coded using a recursive approach. The following script defines a top-level quicksort function for the client, a recursive quicksortHelper function to hide the extra arguments for the end points of a sublist, and a partition function. The script runs quicksort on a list of 20 randomly ordered integers.

def quicksort(lyst):
    """Sorts the items in lyst in ascending order."""
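The pivot-selection strategies mentioned above (a random position, or the median of the first, middle, and last elements) can be sketched as small helpers. These are illustrative assumptions rather than Lambert's code; each returns an index that a partition function could swap into the pivot position:

import random

def median_of_three_index(lyst, left, right):
    # Return the index of the median of lyst[left], lyst[middle], lyst[right].
    middle = (left + right) // 2
    candidates = [(lyst[left], left), (lyst[middle], middle), (lyst[right], right)]
    candidates.sort(key=lambda pair: pair[0])
    return candidates[1][1]

def random_pivot_index(lyst, left, right):
    # Return a pivot position chosen uniformly at random from left..right.
    return random.randint(left, right)

Both strategies make the quadratic worst case unlikely on already-sorted or nearly sorted input.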
- Dr. Basant Agarwal (Author)
- 2022 (Publication Date)
- Packt Publishing (Publisher)
Quicksort is an efficient sorting algorithm. The quicksort algorithm is based on the divide-and-conquer class of algorithms, similar to the merge sort algorithm, where we break (divide) a problem into smaller chunks that are much simpler to solve, and the final results are obtained by combining the outputs of the smaller problems (conquer). The concept behind quicksort is partitioning a given list or array. To partition the list, we first select a data element from the given list, which is called a pivot element. We can choose any element as a pivot element in the list. However, for the sake of simplicity, we'll take the first element in the array as the pivot element. Next, all the elements in the list are compared with this pivot element. At the end of the first iteration, all the elements of the list are arranged in such a way that the elements that are less than the pivot element are placed to the left of the pivot, and the elements that are greater than the pivot element are placed to the right of the pivot. Now, let's understand the working of the quicksort algorithm with an example. In this algorithm, we first partition the given list of unsorted data elements into two sublists in such a way that all the elements on the left side of that partition point (also called a pivot) are smaller than the pivot, and all the elements on the right side of the pivot are greater. This means that the elements of the left sublist and the right sublist will be unsorted, but the pivot element will be at its correct position in the complete list. This is shown in Figure 11.16. Therefore, after the first iteration of the quicksort algorithm, the chosen pivot point is placed in the list at its correct position, and we obtain two unordered sublists and follow the same process again on these two sublists. Thus, the quicksort algorithm partitions the list into two parts and recursively applies the quicksort algorithm to these two sublists to sort the whole list.
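A minimal sketch of the partition step described here, with the first element taken as the pivot (an illustrative version, not the book's listing):

def partition_first_pivot(items, low, high):
    # Use items[low] as the pivot and place it at its correct final position:
    # smaller elements end up to its left, greater elements to its right.
    # Returns the pivot's final index.
    pivot = items[low]
    boundary = low
    for i in range(low + 1, high + 1):
        if items[i] < pivot:
            boundary += 1
            items[boundary], items[i] = items[i], items[boundary]
    items[low], items[boundary] = items[boundary], items[low]
    return boundary

data = [6, 2, 8, 4, 3, 9, 1]
p = partition_first_pivot(data, 0, len(data) - 1)
# data[p] now holds the pivot 6 in its correct sorted position; everything
# before index p is smaller, everything after it is greater or equal.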
Fundamentals of Python
Data Structures
- Kenneth Lambert (Author)
- 2018 (Publication Date)
- Cengage Learning EMEA (Publisher)
Implementation of Quicksort. The quicksort algorithm is most easily coded using a recursive approach. The following script defines a top-level quicksort function for the client, a recursive quicksortHelper function to hide the extra arguments for the endpoints of a sublist, and a partition function. The script runs quicksort on a list of 20 randomly ordered integers.

def quicksort(lyst):
    quicksortHelper(lyst, 0, len(lyst) - 1)

def quicksortHelper(lyst, left, right):
    if left < right:
        pivotLocation = partition(lyst, left, right)
        quicksortHelper(lyst, left, pivotLocation - 1)
        quicksortHelper(lyst, pivotLocation + 1, right)

def partition(lyst, left, right):
    # Find the pivot and exchange it with the last item
    middle = (left + right) // 2
    pivot = lyst[middle]
    lyst[middle] = lyst[right]
    lyst[right] = pivot
    # Set boundary point to first position
    boundary = left
    # Move items less than pivot to the left
    for index in range(left, right):
        if lyst[index] < pivot:
            swap(lyst, index, boundary)
            boundary += 1
    # Exchange the pivot item and the boundary item
    swap(lyst, right, boundary)
    return boundary

# Earlier definition of the swap function goes here

import random

def main(size = 20, sort = quicksort):
    lyst = []
    for count in range(size):
        lyst.append(random.randint(1, size + 1))
    print(lyst)
    sort(lyst)
    print(lyst)

if __name__ == "__main__":
    main()

Merge Sort: Another algorithm called merge sort employs a recursive, divide-and-conquer strategy to break the O(n²) barrier.
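The listing above calls a swap helper whose definition is elided in the excerpt ("Earlier definition of the swap function goes here"). A minimal version consistent with how it is called, added here as an assumption rather than the book's exact code, would be:

def swap(lyst, i, j):
    # Exchange the items at positions i and j of lyst in place.
    lyst[i], lyst[j] = lyst[j], lyst[i]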
Algorithms and Theory of Computation Handbook, Volume 1
General Concepts and Techniques
- Mikhail J. Atallah, Marina Blanton (Authors)
- 2009 (Publication Date)
- Chapman and Hall/CRC (Publisher)
It is a very good choice for an internal sorting algorithm. Sorting by selection with an array having a heap property is also used for external sorting. 3.3.1.4 Quicksort. For many applications a more realistic measure of the time complexity of an algorithm is its expected time. In sorting, a classical example is quicksort [19,29], which has an optimal expected time complexity of O(n log n) under the decision tree model, while there are sequences that force it to perform Ω(n²) operations (in other words, its worst-case time complexity is quadratic). If the worst-case sequences are very rare, or the algorithm exhibits small variance around its expected case, then this type of algorithm is suitable in practice. Several factors have made quicksort a very popular choice for implementing a sorting routine. Algorithm quicksort is a simple divide-and-conquer concept; the partitioning can be done in a very short loop and is also conceptually simple; its memory requirements can be guaranteed to be only logarithmic in the size of the input; the pivot can be placed in a register; and, most importantly, the expected number of comparisons is almost half that of the worst-case optimal competitors, most notably heapsort. In their presentation, many introductory courses on algorithms favor quicksort. However, it is very easy to implement quicksort in a way that seems correct and extremely efficient for many sorting situations, yet hides O(n²) behavior for a simple case (for example, sorting n equal keys). Users of such library routines will be satisfied initially, only to find out later that on something that seems a simple sorting task, the implementation is consuming too much time to finish the sort. Fine tuning of quicksort is a delicate issue [5]. Many of the improvements proposed may be compensated by reduced applicability of the method or more fragile and less clear code.
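The warning about hidden O(n²) behavior on inputs with many equal keys is commonly addressed with a three-way ("Dutch national flag") partition, which groups keys equal to the pivot so they are never recursed into again. The sketch below illustrates that idea; it is not taken from the handbook and the function name is only illustrative:

def quicksort_three_way(items, low, high):
    # Illustrative sketch: partition items[low..high] into regions that are
    # less than, equal to, and greater than the pivot, then recurse only on
    # the strictly smaller and strictly larger regions.
    if low >= high:
        return
    pivot = items[low]
    lt, i, gt = low, low, high
    while i <= gt:
        if items[i] < pivot:
            items[lt], items[i] = items[i], items[lt]
            lt += 1
            i += 1
        elif items[i] > pivot:
            items[i], items[gt] = items[gt], items[i]
            gt -= 1
        else:
            i += 1
    quicksort_three_way(items, low, lt - 1)
    quicksort_three_way(items, gt + 1, high)

On a list of n equal keys this makes a single pass and stops, instead of degrading to quadratic time.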
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.





