## What Is an AVL Tree?

The AVL tree is named after its inventors, Adelson-Velskii and Landis. It is commonly referred to as a height-balanced binary tree. Each node in an AVL tree has one of the following characteristics:

A node is referred to as "left heavy" if the longest path in its left subtree is longer than the longest path in its right subtree.

A node is referred to as "right heavy" if the longest path in its right subtree is longer than the longest path in its left subtree.

A node is said to be balanced if the longest paths in the right and left subtrees are equal.

The AVL tree is a height-balanced tree in which, for every node, the difference between the heights of its left and right subtrees is -1, 0, or 1. This difference is known as the balance factor. As a result, we can define an AVL tree as a balanced binary search tree in which each node's balance factor is either -1, 0, or +1. The balance factor is determined by the following formula:

Balance Factor (node) = Height (left subtree) - Height (right subtree)
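The formula above can be sketched in code. This is a minimal illustration (the `Node` class and function names are hypothetical, not from any particular library) that computes subtree heights recursively and derives the balance factor from them:

```python
# Hypothetical minimal node type for illustrating the balance factor.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def height(node):
    # Height of an empty subtree is taken as 0 in this sketch.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # Balance factor = height(left subtree) - height(right subtree)
    return height(node.left) - height(node.right)

# A left-leaning chain: 10 -> 5 -> 2
root = Node(10)
root.left = Node(5)
root.left.left = Node(2)
print(balance_factor(root))  # 2 -> left-heavy, violates the AVL property
```

A real AVL implementation stores the height in each node instead of recomputing it, but the recursive version keeps the definition visible.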

## Balance Factor Properties:

• If a subtree has a balance factor greater than zero, it is said to be left-heavy because the height of the left subtree is greater than the height of the right subtree.
• If a subtree has a balance factor less than zero, it is said to be right-heavy because the height of the right subtree is greater than the height of the left subtree.
• If the balance factor is zero, the subtree is perfectly balanced, with equal heights in both the left and right subtrees.

### The AVL Tree's rotations:

If any tree node falls out of balance, the necessary rotations are carried out to correct the imbalance. Four kinds of rotation are possible, used in the following scenarios:

LL Rotation: performed when a new node is inserted into the left subtree of the left child of the unbalanced node. A single right rotation corrects it.

RR Rotation: performed when a new node is inserted into the right subtree of the right child of the unbalanced node. A single left rotation corrects it.

LR Rotation: performed when a new node is inserted into the right subtree of the left child of the unbalanced node. A left rotation followed by a right rotation corrects it.

RL Rotation: performed when a new node is inserted into the left subtree of the right child of the unbalanced node. A right rotation followed by a left rotation corrects it.
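The two single rotations above can be sketched as plain pointer rearrangements. This is an illustrative fragment (the `Node` class and variable names are hypothetical); the double rotations LR and RL are just compositions of these two:

```python
# Hypothetical node type for demonstrating rotations.
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_right(z):
    # Fixes an LL imbalance: the left child becomes the new subtree root.
    y = z.left
    z.left = y.right
    y.right = z
    return y

def rotate_left(z):
    # Fixes an RR imbalance: the right child becomes the new subtree root.
    y = z.right
    z.right = y.left
    y.left = z
    return y

# LR imbalance: rotate the left child left, then rotate the node right.
# RL imbalance: rotate the right child right, then rotate the node left.

z = Node(30, left=Node(20, left=Node(10)))  # LL-shaped chain 30-20-10
root = rotate_right(z)
print(root.key, root.left.key, root.right.key)  # 20 10 30
```

After the rotation the chain 30-20-10 becomes a balanced subtree rooted at 20.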

## Binary Search Tree and AVL Tree:

An AVL tree is a kind of binary search tree. It has some nice complexity guarantees, such as O(log n) search, insertion, and deletion. Structurally it is just a binary search tree that uses some extra algorithms (and an additional field per node) to guarantee this complexity.

A binary search tree is a kind of binary tree: both have nodes with at most two children, usually named left and right. In a binary search tree the keys are ordered: every key in a node's left subtree is smaller than the node's key, and every key in its right subtree is greater (or equal, depending on the convention). This implies that all values in the left subtree are smaller than those in the right subtree.
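The ordering property is what makes binary search possible. This minimal sketch (hypothetical names, no balancing) shows the BST insert and search that an AVL tree inherits:

```python
# Hypothetical plain BST, without any balancing.
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(node, key):
    # Keys smaller than the node's key go left; larger or equal go right.
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    return node

def search(node, key):
    # At each node the ordering rule discards one whole subtree.
    while node is not None:
        if key == node.key:
            return True
        node = node.left if key < node.key else node.right
    return False

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
print(search(root, 6), search(root, 7))  # True False
```

An AVL tree uses exactly this search; only the insert and delete paths differ, because they rebalance afterwards.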

## Importance of AVL Tree:

AVL trees are self-balancing binary search trees (BSTs). A normal BST may be skewed to either side, which makes searching for a key much more expensive: the cost can grow well beyond O(log n) and in the worst case reach O(n), the same effort as a sequential search. This skewing can make plain BSTs impractical as an efficient data structure.

Self-balancing binary trees such as the AVL tree are the solution. The tree detects when the depths of its left and right subtrees differ and rotates some pivot nodes left, right, or both, keeping the height difference at each node to at most 1. Balancing is performed on every insertion to keep insertion complexity consistent. However, since every node now carries extra height (or balance) information, slightly more space is needed. In exchange, data can be retrieved quickly: the complexity is typically O(log n).

Searching can be done with a plain binary search tree (BST), but if the elements are inserted in sorted order, the BST degenerates into a chain and the time complexity becomes O(n). The AVL tree was introduced to overcome this problem.

In both the average and worst cases, lookup, insertion, and deletion take O(log n) time, where n is the number of nodes in the tree before the operation. After insertions and deletions, the tree may need one or more rotations to restore balance.

Red-black and AVL trees support the same set of operations and take O(log n) time for basic operations, which is why they are frequently compared. Like red-black trees, AVL trees are height-balanced, but because they are more rigidly balanced, AVL trees are faster for lookup-intensive applications.

Due to the balance factor, an AVL tree never takes the form of a chain; it is a nearly complete binary tree.

Operations like insertion, deletion, and searching take O(log n) time in both the worst and average cases.

Example:

Consider the following trees

Figure 1: BST

Figure 2: AVL Tree

In order to insert a node with key Q into the binary tree of Figure 1, the algorithm requires 7 comparisons, but as Figure 2 shows, inserting the same key into the AVL tree requires only 3 comparisons.

Even searching and deletion take more time in BST than in AVL.

• Suppose you want to insert 8 into the tree:
• BST - 7 comparisons to insert 8.
• AVL tree - 3 comparisons to insert 8.
• Suppose you want to search for 6:
• BST - after 5 comparisons you find 6.
• AVL - after 1 comparison you find 6.

When the nodes are arranged in increasing or decreasing order, the height of a BST is O(n). As a result, all operations take O(n) time, which is the worst case.

The AVL tree balances itself so that its height is always O(log n), preventing skewing and ensuring that the upper bound for all operations, whether insertion, deletion, or search, is O(log n).
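The gap between the two height bounds is easy to quantify. This small sketch (the numbers are illustrative, not from the source) compares the height of a fully skewed BST with the roughly logarithmic height a balanced tree achieves over the same number of keys:

```python
import math

# Illustrative comparison: a fully skewed BST of n nodes has height n
# (one node per level), while a height-balanced tree over the same keys
# has height about log2(n).
n = 1000
skewed_height = n
balanced_height = math.ceil(math.log2(n + 1))
print(skewed_height, balanced_height)  # 1000 10
```

So for a thousand keys, a search may cost up to 1000 comparisons in the worst-case skewed BST but only about 10 in a balanced tree.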

Time Complexity (Worst Case):

• AVL Tree
  • Insertion - O(log n)
  • Deletion - O(log n)
  • Searching - O(log n)
• BST
  • Insertion - O(n)
  • Deletion - O(n)
  • Searching - O(n)

• AVL trees can self-balance: keeping trees balanced is a primary concern in practice, because an unbalanced tree means operations take longer to complete, slowing down lookup-heavy applications. An AVL tree, also known as a self-balancing binary search tree, supports the three major operations, search, insert, and delete, while keeping itself balanced.
• It is not skewed in any way.
• Inserting or deleting a node has low time complexity.
• It also provides faster search operations: the most important benefit of AVL trees is that they perform searches faster than plain BSTs, red-black trees, etc., so lookups complete much sooner. This is typically desirable when projects must run reliably and on time.
• An AVL tree also has balancing capabilities through its different types of rotation.
• It performs faster searches than red-black trees.
• It has better search time complexity than other trees, such as a plain binary tree.
• Its height never exceeds O(log N), where N is the total number of nodes in the tree.

## AVL Tree Applications:

AVL trees are applied in the following situations:

• There are few insertion and deletion operations
• Short search time is needed
• Input data is sorted or nearly sorted
• It is used to index large amounts of data in a database and to search through it efficiently.
• For all kinds of in-memory collections, such as sets and dictionaries, we use AVL Trees.
• Applications for databases where insertions and deletions are infrequent but frequent data searches are necessary.
• Software that requires improved search.
• It is used in corporate settings and storyline games.
