Computer Science

AVL Tree

An AVL tree is a self-balancing binary search tree that maintains a balance factor for each node: the difference between the heights of the node's left and right subtrees. AVL trees are named after their inventors, Adelson-Velsky and Landis.
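To make the definition concrete, here is a minimal sketch of an AVL node, written in Java for illustration only (the class and method names are not taken from any of the excerpts below). Height conventions vary across the books quoted here; this sketch counts edges, so a single leaf has height 0 and an empty subtree has height -1.

```java
// Minimal sketch of an AVL node; names are illustrative, not from any book quoted below.
// Convention: height counts edges, so a leaf has height 0 and an empty subtree -1.
class AvlNode {
    int key;
    int height;                 // cached height of the subtree rooted here (0 for a new leaf)
    AvlNode left, right;

    AvlNode(int key) { this.key = key; }

    // Height of a possibly empty subtree.
    static int height(AvlNode n) {
        return (n == null) ? -1 : n.height;
    }

    // Balance factor = height(left) - height(right); an AVL tree keeps it in {-1, 0, +1}.
    static int balanceFactor(AvlNode n) {
        return height(n.left) - height(n.right);
    }

    // Recompute this node's cached height from its children after an update.
    void updateHeight() {
        height = 1 + Math.max(height(left), height(right));
    }
}
```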

Written by Perlego with AI-assistance

10 Key excerpts on "AVL Tree"

  • Data Structures and Algorithms in Java
    • Michael T. Goodrich, Roberto Tamassia, Michael H. Goldwasser (Authors)
    • 2014 (Publication Date)
    • Wiley (Publisher)
    11.3 AVL Trees. The TreeMap class, which uses a standard binary search tree as its data structure, should be an efficient map data structure, but its worst-case performance for the various operations is linear time, because it is possible that a series of operations results in a tree with linear height. In this section, we describe a simple balancing strategy that guarantees worst-case logarithmic running time for all the fundamental map operations.
    Definition of an AVL Tree. The simple correction is to add a rule to the binary search-tree definition that will maintain a logarithmic height for the tree. Recall that we defined the height of a subtree rooted at position p of a tree to be the number of edges on the longest path from p to a leaf (see Section 8.1.3). By this definition, a leaf position has height 0. In this section, we consider the following height-balance property, which characterizes the structure of a binary search tree T in terms of the heights of its nodes.
    Height-Balance Property: For every internal position p of T, the heights of the children of p differ by at most 1.
    Any binary search tree T that satisfies the height-balance property is said to be an AVL Tree, named after the initials of its inventors: Adel'son-Vel'skii and Landis. An example of an AVL Tree is shown in Figure 11.10.
    [Figure 11.10: An example of an AVL Tree. The keys of the entries are shown inside the nodes, and the heights of the nodes are shown above the nodes (all leaves have height 0).]
    An immediate consequence of the height-balance property is that a subtree of an AVL Tree is itself an AVL Tree. The height-balance property also has the important consequence of keeping the height small, as shown in the following proposition.
    Proposition 11.1: The height of an AVL Tree storing n entries is O(log n).
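The height-balance property can be checked mechanically. The sketch below, which assumes the illustrative AvlNode class introduced above, walks the tree once and reports whether every node satisfies the property; it is a quick way to test an implementation, not code from the book.

```java
// Sketch: check the height-balance property at every node of a tree built from
// the illustrative AvlNode class above. Not taken from the book.
final class HeightBalanceCheck {
    private static final int UNBALANCED = Integer.MIN_VALUE;   // sentinel value

    // Returns the height (in edges) of the subtree rooted at n, or UNBALANCED
    // if some node at or below n has children whose heights differ by more than 1.
    static int checkedHeight(AvlNode n) {
        if (n == null) return -1;                               // empty subtree
        int hl = checkedHeight(n.left);
        int hr = checkedHeight(n.right);
        if (hl == UNBALANCED || hr == UNBALANCED) return UNBALANCED;
        if (Math.abs(hl - hr) > 1) return UNBALANCED;           // property violated here
        return 1 + Math.max(hl, hr);
    }

    static boolean satisfiesHeightBalance(AvlNode root) {
        return checkedHeight(root) != UNBALANCED;
    }
}
```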
  • Data Structures And Algorithms
    However, since we do not control the order of insertion and the tree may not be balanced, we may need considerable time to manipulate a tree with n nodes. Thus, in order to optimize search times, we keep the tree balanced at all times. An AVL Tree takes O(log n) time in the worst case for searches, insertions and deletions in a tree with n nodes.
    11.3.1. Definition. In a complete binary tree with n nodes, we know already that the left and right subtrees of any node have the same height. In this section, we try to build a search tree in which the heights of the left and right subtrees of every node never differ by more than 1.
    [Fig. 11.21: An example of an AVL Tree.]
    Definition. An AVL Tree is a binary search tree that is either empty or that consists of two AVL subtrees, a left and a right subtree, whose heights differ by no more than 1: |H_L - H_R| <= 1, where H_L is the height of the left subtree and H_R is the height of the right subtree. Figure 11.21 shows an example of an AVL Tree that satisfies the definition.
    In Fig. 11.21, LH, RH and EH are balance-factor labels that mean a left-high node, a right-high node and equal heights, respectively. LH = 1 of root 6 means that the left subtree of root 6 has height 1 more than its right subtree. RH = 1 of node 7 means that its right subtree has height 1 more than its left subtree. EH = 0 means equal heights. In Fig. 11.21, since the height of the left subtree (whose root is 4) is 3 and the height of the right subtree (whose root is 7) is 2, |H_L - H_R| = 1. Since both subtrees (of 4 and 7) are balanced, the tree is balanced. The left subtree (root 4) is balanced because the heights of its subtrees differ by only 1, and since its subtrees (roots 2 and 5) are balanced, this subtree is a balanced tree. The right subtree (root 7) is balanced because the heights of its subtrees differ by only 1. Continuing in this way, we confirm that the whole tree is balanced.
    Example. Figure 11.22 shows the AVL Trees with only one node or two nodes.
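The LH/EH/RH labels used in this excerpt correspond to the sign of H_L - H_R. A tiny sketch, reusing the illustrative AvlNode helpers from above (not the book's code), that assigns those labels:

```java
// Sketch: label a node as left-high (LH), equal-height (EH) or right-high (RH),
// following the excerpt's H_L - H_R convention. Reuses the illustrative AvlNode class.
final class BalanceLabels {
    static String label(AvlNode n) {
        int bf = AvlNode.balanceFactor(n);   // H_L - H_R
        if (bf > 0) return "LH";             // left subtree is higher
        if (bf < 0) return "RH";             // right subtree is higher
        return "EH";                         // both subtrees have equal height
    }
}
```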
  • Data Structures Through C
    eBook - ePub
    Learn the fundamentals of Data Structures through C

    rightflag is a thread. When we reach a link we go to the right child and again follow the same procedure by checking its left sub-tree.
    As we follow these steps we are sometimes likely to reach the head node, and that is the time to stop the procedure.

    AVL Trees

    We know that the height of a BST is the maximum number of edges on a path from the root node to a leaf node. Note that if we change the order of insertion of nodes in a BST, we may get BSTs of different heights. As a confirmation, you may try creating two BSTs using the insertion orders 30, 40, 10, 50, 20, 5, 35 and 50, 40, 35, 30, 20, 10, 5. In the first case you would get a BST of height 2 and in the second case a BST of height 6.
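The claim about the two insertion orders is easy to verify with a short program. The sketch below (plain, unbalanced BST code written for this page, not taken from the book) builds both trees and prints their heights:

```java
// Sketch: build plain (unbalanced) BSTs from two insertion orders and compare heights.
final class BstHeightDemo {
    static final class Node {
        int key; Node left, right;
        Node(int key) { this.key = key; }
    }

    static Node insert(Node root, int key) {
        if (root == null) return new Node(key);
        if (key < root.key) root.left = insert(root.left, key);
        else root.right = insert(root.right, key);
        return root;
    }

    // Height counted in edges: a single node has height 0, an empty tree -1.
    static int height(Node root) {
        if (root == null) return -1;
        return 1 + Math.max(height(root.left), height(root.right));
    }

    public static void main(String[] args) {
        int[][] orders = { {30, 40, 10, 50, 20, 5, 35}, {50, 40, 35, 30, 20, 10, 5} };
        for (int[] order : orders) {
            Node root = null;
            for (int key : order) root = insert(root, key);
            System.out.println(java.util.Arrays.toString(order) + " -> height " + height(root));
        }
        // Prints height 2 for the first order and height 6 for the second.
    }
}
```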
    Also, the search time in a BST depends upon its height. Searching is efficient if the heights of the left and right sub-trees of every node are equal. However, frequent insertions and deletions in a BST are likely to make it unbalanced. The efficiency of searching is ideal if the difference between the heights of the left and right sub-trees of all the nodes in a BST is at most one. Such a binary search tree is called a Balanced BST. It was invented in the year 1962 by two Russian mathematicians, G. M. Adelson-Velskii and E. M. Landis. Hence such trees are also known as AVL Trees. Figure 7-15 shows some examples of AVL Trees.
    Figure 7-15. AVL Trees.
    The balance factor of a node is calculated as the height of the left sub-tree minus the height of the right sub-tree of the node. The balance factor of any node in an AVL BST should be -1, 0 or 1. If it is anything other than these three values, then the tree is not balanced.
    To re-balance the tree and make it an AVL Tree, the nodes need to be properly adjusted. This is done by performing one of the 4 types of rotations: Left rotation, Right rotation, Left-Right rotation and Right-Left rotation. Of these, the first two involve a one-step process, whereas the next two involve a two-step process.
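For reference, here is a sketch of the two single rotations, written against the illustrative AvlNode class introduced earlier rather than the book's C code. Each rotation returns the new root of the rotated subtree and refreshes the stored heights:

```java
// Sketch: the two single AVL rotations, returning the new subtree root.
// Uses the illustrative AvlNode class (key, height, left, right, updateHeight()).
final class Rotations {
    // Right rotation: fixes a left-heavy (Left-Left) subtree rooted at y.
    static AvlNode rotateRight(AvlNode y) {
        AvlNode x = y.left;        // x becomes the new subtree root
        y.left = x.right;          // x's right subtree moves under y
        x.right = y;
        y.updateHeight();          // y is now below x, so refresh it first
        x.updateHeight();
        return x;
    }

    // Left rotation: fixes a right-heavy (Right-Right) subtree rooted at x.
    static AvlNode rotateLeft(AvlNode x) {
        AvlNode y = x.right;       // y becomes the new subtree root
        x.right = y.left;          // y's left subtree moves under x
        y.left = x;
        x.updateHeight();
        y.updateHeight();
        return y;
    }
}
```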
  • Data Structures: Abstraction and Design Using Java
    eBook - PDF
    • Elliot B. Koffman, Paul A. T. Wolfgang (Authors)
    • 2021 (Publication Date)
    • Wiley (Publisher)
    Their algorithm keeps track of the difference in height of each subtree. As items are added to (or removed from) the tree, the balance (i.e., the difference in the heights of the subtrees) of each subtree from the insertion point up to the root is updated. If the balance ever gets out of the range -1 ... +1, the subtree is rotated to bring it back into balance. Trees using this approach are known as AVL Trees after the initials of the inventors. As before, we define the height of a tree as the number of nodes in the longest path from the root to a leaf node, including the root.
    Balancing a Left-Left Tree. Figure 9.9 shows a binary search tree with a balance of -2 caused by an insert into its left-left subtree. This tree is called a Left-Left tree because its root and the left subtree of the root are both left-heavy. Each white triangle with label a, b, or c represents a tree of height k; the area in color at the bottom of the left-left triangle (tree a) indicates an insertion into this tree (its height is now k + 1). We use the formula h_R - h_L to calculate the balance for each node, where h_L and h_R are the heights of the left and right subtrees, respectively. The actual heights are not important; it is their relative difference that matters. The right subtree (b) of node 25 has a height of k; its left subtree (a) has a height of k + 1, so its balance is -1. The right subtree of node 50 has a height of k; its left subtree has a height of k + 2 (adding one for node 25), so its balance is -2. Figure 9.10 shows this same tree after a rotation right. The new tree root is node 25. Its right subtree (root 50) now has tree b as its left subtree. Note that balance has now been achieved. Also, the overall height has not increased. Before the insertion, the tree height was k + 2; after the rotation, the tree height is still k + 2.
    Balancing a Left-Right Tree. Figure 9.11 shows a left-heavy tree caused by an insert into the Left-Right subtree.
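The Left-Right case described in this excerpt is usually handled as a composition of the two single rotations: rotate the left child left to obtain a Left-Left shape, then rotate the root right. A sketch, building on the illustrative Rotations class above (not the book's Java code):

```java
// Sketch: double rotations, composed from the single rotations sketched earlier.
final class DoubleRotations {
    // Left-Right case: the left child is right-heavy.
    static AvlNode rotateLeftRight(AvlNode z) {
        z.left = Rotations.rotateLeft(z.left);   // step 1: turn the subtree into a Left-Left shape
        return Rotations.rotateRight(z);         // step 2: single right rotation at the root
    }

    // Right-Left case: the right child is left-heavy (the mirror image).
    static AvlNode rotateRightLeft(AvlNode z) {
        z.right = Rotations.rotateRight(z.right);
        return Rotations.rotateLeft(z);
    }
}
```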
  • Data Science with Semantic Technologies: Theory, Practice and Application
    eBook - PDF
    • Archana Patel, Narayan C. Debnath, Bharat Bhusan (Authors)
    • 2022 (Publication Date)
    • Wiley-Scrivener (Publisher)
    A delicate part of algorithms on binary search trees consists in maintaining them balanced, or balancing them from time to time, to keep good performance of operations. Recall that balancing the tree makes the cost of finding any key lg(n). There are many variations of balanced binary search trees. Among the most classic and widely used balanced binary trees we can cite AVL Trees [9], Red-Black trees [10] and AA trees [11]. Among recent balanced binary trees, we can cite weak AVL Trees [12] and zip trees [13]. In all of these methods, tree rebalancing is performed during each update operation. Through this work, we want to present a new balanced binary search tree that partitions the nodes of a binary search tree into classes with height h-1 or h-2, h being the parameter of the data structure. Each class holds an AVL Tree. As a consequence, two kinds of nodes are generated: simple and class nodes. We thus have a double balance: globally, the tree is perfectly balanced considering only class nodes, and locally, i.e., inside a class, tree balance is provided by the AVL Tree technique. One additional byte at the level of any node suffices to represent both its kind and its height. The latter controls the balance of the tree inside classes. One of the major advantages of the new structure is that it includes Red-Black trees and AVL Trees. Indeed, when h is equal to 2, the new structure generates a binary search tree equivalent to a Red-Black tree. When h reaches a certain value h', i.e. when one class is created, it amounts to an AVL Tree. For each value of h between 2 and h', the new structure generates a new balanced binary search tree unknown until now. Note that two doctoral theses focused on the proposed structure. The first [14] proposes a unique representation for AVL and Red-Black trees. The second [15] proposes a generalization of the Red-Black tree structure.
  • Objects, Abstraction, Data Structures and Design
    • Elliot B. Koffman, Paul A. T. Wolfgang (Authors)
    • 2012 (Publication Date)
    • Wiley (Publisher)
    We also need to provide functions similar to the ones needed for removal in a binary search tree. Implementing removal is left as a programming project. Also, observe that the effect of rotations is not only to restore balance but to decrease the height of the subtree being rotated. Thus, while only one call to rebalance_left or rebalance_right was required for insertion, during removal each recursive return could result in a further need to rebalance.
    Performance of the AVL Tree. Because each subtree is kept as close to balanced as possible, one would expect that the AVL Tree provides the expected O(log n) performance. Each subtree is allowed to be out of balance by ±1. Thus, the tree may contain some holes. It can be shown that in the worst case the height of an AVL Tree can be 1.44 times the height of a complete binary tree that contains the same number of items. However, this would still yield O(log n) performance, because we ignore constants. The worst-case performance is very rare. Empirical tests (see, for example, Donald Knuth, The Art of Computer Programming, Vol. 3: Searching and Sorting [Addison-Wesley, 1973], p. 460) show that, on the average, log_2 n + 0.25 comparisons are required to insert the nth item into an AVL Tree. Thus the average performance is very close to that of the corresponding complete binary search tree.
    Exercises for Section 11.2. Self-Check:
    1. Show how the final AVL Tree for "The quick brown fox . . . dog" changes as you insert "apple", "cat", and "hat" in that order.
    2. Show the effect of just rotating right on the tree in Figure 11.11. Why doesn't this fix the problem?
    3. Build an AVL Tree that inserts the integers 30, 40, 15, 25, 90, 80, 70, 85, 15, 72 in the given order.
    4. Build the AVL Tree from the sentence "Now is the time for all good men to come to the aid of their party."
    Programming: 1.
  • Swift Data Structure and Algorithms
    • Erik Azar, Mario Eguiluz Alebicto (Authors)
    • 2016 (Publication Date)
    • Packt Publishing (Publisher)
    [-1, 1], as expected.
    Now that we know how to rebalance the AVL Tree with these four techniques (two simple rotations and two double rotations), let's finish our coverage of AVL Trees by explaining the search and insertion processes.

    Search

    Search in an AVL Tree is the same as in a binary search tree; there is no difference in methodology when searching for a specific node.
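A minimal sketch of that search, using the illustrative AvlNode class from the earlier sketches rather than the book's Swift code:

```java
// Sketch: searching an AVL tree is ordinary BST search; balance plays no role here.
final class AvlSearch {
    static AvlNode search(AvlNode root, int key) {
        AvlNode current = root;
        while (current != null) {
            if (key == current.key) return current;    // found the key
            current = (key < current.key) ? current.left : current.right;
        }
        return null;                                   // key not present
    }
}
```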

    Insertion

    However, the insertion process is more complicated. AVL Trees are strictly balanced trees, and inserting a new node can break that balance. Once we put the new node in the correct subtree, we need to ensure that the balance factors of its ancestors are correct. This process is called retracing. If any balance factor has a wrong value (not in the range [-1, 1]), we will use rotations to fix it.
    Let's see what the process looks like at a high level (a code sketch of the retracing loop follows the list):
    • We insert the new node Z into a valid AVL Tree, so all the balance factors are in the range [-1, 1].
    • We are going to add node Z into a subtree of node X.
    • We go from the bottom of the tree up towards the root, checking the balance factors. If any of them is incorrect, we perform rotations to fix it. Then we continue going up until the balance factors are equal to 0 or until we reach the root.
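Here is the retracing loop sketched as recursive Java insertion, assuming the illustrative AvlNode, Rotations and DoubleRotations helpers defined in the earlier sketches; the book's Swift implementation differs in detail, but the structure is the same: insert at the bottom, then fix balance factors on the way back up.

```java
// Sketch: AVL insertion with retracing. Each recursive return re-checks the
// balance factor and applies the appropriate rotation, so balance is restored
// on the way back up from the insertion point towards the root.
final class AvlInsert {
    static AvlNode insert(AvlNode node, int key) {
        if (node == null) return new AvlNode(key);     // place the new node
        if (key < node.key)      node.left  = insert(node.left, key);
        else if (key > node.key) node.right = insert(node.right, key);
        else return node;                              // duplicate key: no change

        node.updateHeight();                           // retracing step
        int bf = AvlNode.balanceFactor(node);          // height(left) - height(right)

        if (bf > 1) {                                  // left-heavy
            if (key < node.left.key) return Rotations.rotateRight(node);   // Left-Left
            return DoubleRotations.rotateLeftRight(node);                  // Left-Right
        }
        if (bf < -1) {                                 // right-heavy
            if (key > node.right.key) return Rotations.rotateLeft(node);   // Right-Right
            return DoubleRotations.rotateRightLeft(node);                  // Right-Left
        }
        return node;                                   // already balanced
    }
}
```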

    Trie tree

    Until now, we have seen different types of trees, such as binary trees, binary search trees, red-black trees, and AVL Trees. In all these types of trees, the content of a node (a value or a key) is not related to the content of a previous node. A single node has a complete meaning, such as a value or number by itself.
    But in some scenarios in real life, we need to store a series of data in which the values have common parts; think of the suffixes or prefixes in related words, in the alphabet, or in a telephone directory.
    Here is where a trie tree shines. Tries are ordered data structures in which edges contain part of a key and descendant nodes share common parts of the previous values. Check this example out:
  • Advanced Data Structures
    3 Balanced Search Trees. In the previous chapter, we discussed search trees, giving find, insert, and delete methods, whose complexity is bounded by O(h), where h is the height of the tree, that is, the maximum length of any path from the root to a leaf. But the height can be as large as n; in fact, a linear list can be a correct search tree, but it is very inefficient. The key to the usefulness of search trees is to keep them balanced, that is, to keep the height bounded by O(log n) instead of O(n). This fundamental insight, together with the first method that achieved it, is due to Adel'son-Vel'skii and Landis (1962), who in their seminal paper invented the height-balanced tree, now frequently called the AVL Tree. The height-balanced tree achieves a height bound h ≤ 1.44 log n + O(1). Because any tree with n leaves has height at least log n, this is already quite good. There are many other methods that achieve similar bounds, which we will discuss in this chapter.
    3.1 Height-Balanced Trees. A tree is height-balanced if, in each interior node, the height of the right subtree and the height of the left subtree differ by at most 1. This is the oldest balance criterion for trees, introduced and analyzed by G. M. Adel'son-Vel'skii and E. M. Landis (1962), and still the most popular variant of balanced search trees (AVL Trees). A height-balanced tree necessarily has small height.
    Theorem. A height-balanced tree of height h has at least ((3+√5)/(2√5))·((1+√5)/2)^h − ((3−√5)/(2√5))·((1−√5)/2)^h leaves. A height-balanced tree with n leaves has height at most ⌈log_((1+√5)/2) n⌉ = ⌈c_Fib · log_2 n⌉ ≈ 1.44 log_2 n, where c_Fib = (log_2((1+√5)/2))^(−1).
    Proof. Let F_h denote a height-balanced tree of height h with a minimal number of leaves. Either the left or the right subtree of root(F_h) must have height h − 1, and because the tree is height-balanced, the other subtree has height at least h − 2.
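The minimal trees F_h in this proof satisfy a Fibonacci-style recurrence, leaves(F_h) = leaves(F_{h-1}) + leaves(F_{h-2}), with the natural base cases leaves(F_0) = 1 and leaves(F_1) = 2 for leaf-based trees. The sketch below (illustrative code, not from the book) tabulates the recurrence and compares each height with the ≈ 1.44 log_2 n bound:

```java
// Sketch: tabulate the minimum number of leaves of a height-balanced tree of
// height h via the recurrence from the proof, and compare h with the stated
// bound c_Fib * log2(n) ~ 1.44 * log2(n). Illustrative code only.
public final class HeightBoundDemo {
    public static void main(String[] args) {
        long[] minLeaves = new long[41];
        minLeaves[0] = 1;                      // a single leaf
        minLeaves[1] = 2;                      // one interior node with two leaves
        for (int h = 2; h < minLeaves.length; h++) {
            minLeaves[h] = minLeaves[h - 1] + minLeaves[h - 2];
        }
        double cFib = 1.0 / (Math.log((1 + Math.sqrt(5)) / 2) / Math.log(2));
        for (int h = 4; h < minLeaves.length; h += 6) {
            long n = minLeaves[h];             // fewest leaves any height-h tree can have
            double bound = cFib * (Math.log(n) / Math.log(2));
            // h never exceeds the bound, which stays only slightly above h itself.
            System.out.printf("h = %2d   min leaves = %12d   c_Fib*log2(n) = %6.2f%n", h, n, bound);
        }
    }
}
```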
  • Algorithm Design and Applications
    • Michael T. Goodrich, Roberto Tamassia (Authors)
    • 2014 (Publication Date)
    • Wiley (Publisher)
    That is, we have the following theorem. Theorem 4.2: An AVL Tree for n key-element items uses O(n) space and executes the operations find, insert and remove in O(log n) time each.
    Operation | Time     | Structural Changes
    find      | O(log n) | none
    insert    | O(log n) | O(1)
    remove    | O(log n) | O(log n)
    Table 4.8: Performance of an n-element AVL Tree. The space usage is O(n).
    Algorithm insertAVL(k, e, T):
      Input: A key-element pair, (k, e), and an AVL Tree, T
      Output: An update of T to now contain the item (k, e)
      v ← IterativeTreeSearch(k, T)
      if v is not an external node then
        return "An item with key k is already in T"
      Expand v into an internal node with two external-node children
      v.key ← k
      v.element ← e
      v.height ← 1
      rebalanceAVL(v, T)
    Algorithm removeAVL(k, T):
      Input: A key, k, and an AVL Tree, T
      Output: An update of T to now have an item (k, e) removed
      v ← IterativeTreeSearch(k, T)
      if v is an external node then
        return "There is no item with key k in T"
      if v has no external-node child then
        Let u be the node in T with key nearest to k
        Move u's key-value pair to v
        v ← u
      Let w be v's smallest-height child
      Remove w and v from T, replacing v with w's sibling, z
      rebalanceAVL(z, T)
    Algorithm rebalanceAVL(v, T):
      Input: A node, v, where an imbalance may have occurred in an AVL Tree, T
      Output: An update of T to now be balanced
      v.height ← 1 + max{v.leftChild().height, v.rightChild().height}
      while v is not the root of T do
        v ← v.parent()
        if |v.leftChild().height − v.rightChild().height| > 1 then
          Let y be the tallest child of v and let x be the tallest child of y
          v ← restructure(x)    // trinode restructure operation
        v.height ← 1 + max{v.leftChild().height, v.rightChild().height}
    Algorithm 4.7: Methods for item insertion and removal in an AVL Tree, as well as the method for rebalancing an AVL Tree.
  • Data Structures and Algorithms in C++
    • Michael T. Goodrich, Roberto Tamassia, David M. Mount (Authors)
    • 2011 (Publication Date)
    • Wiley (Publisher)
    In Code Fragment 10.14, we present the class definition for AVLTree. This class is derived from the class SearchTree, but using our enhanced AVLEntry in order to maintain height information for the nodes of the tree. The class defines a number of typedef shortcuts for referring to entities such as keys, values, and tree positions. The class declares all the standard dictionary public member functions. At the end, it also defines a number of protected utility functions, which are used in maintaining the AVL Tree balance properties. (The template arguments below were lost in extraction and have been restored from the surrounding description.)
    template <typename E>                                        // an AVL Tree
    class AVLTree : public SearchTree< AVLEntry<E> > {
    public:                                                      // public types
      typedef AVLEntry<E> AVLEntry;                              // an entry
      typedef typename SearchTree<AVLEntry>::Iterator Iterator;  // an iterator
    protected:                                                   // local types
      typedef typename AVLEntry::Key K;                          // a key
      typedef typename AVLEntry::Value V;                        // a value
      typedef SearchTree<AVLEntry> ST;                           // a search tree
      typedef typename ST::TPos TPos;                            // a tree position
    public:                                                      // public functions
      AVLTree();                                                 // constructor
      Iterator insert(const K& k, const V& x);                   // insert (k,x)
      void erase(const K& k) throw(NonexistentElement);          // remove key k entry
      void erase(const Iterator& p);                             // remove entry at p
    protected:                                                   // utility functions
      int height(const TPos& v) const;                           // node height utility
      void setHeight(TPos v);                                    // set height utility
      bool isBalanced(const TPos& v) const;                      // is v balanced?
      TPos tallGrandchild(const TPos& v) const;                  // get tallest grandchild
      void rebalance(const TPos& v);                             // rebalance utility
    };
    Code Fragment 10.14: Class AVLTree, an AVL Tree implementation of a dictionary.
    Next, in Code Fragment 10.15, we present the constructor and height utility function. The constructor simply invokes the constructor for the binary search tree, which creates a tree having no entries. The function height returns the height of a node, by extracting the height information from the AVLEntry.
Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.