AVL Trees
In the previous lesson, we studied Binary Search Trees (BSTs) and saw how their performance depends heavily on the shape of the tree.
While BSTs are powerful, they have one major weakness: they can become unbalanced.
This lesson introduces AVL Trees, a self-balancing version of BSTs that guarantees efficient performance.
Why Do We Need AVL Trees?
Recall that in a regular BST, if values are inserted in sorted order, the tree becomes skewed.
In such cases, searching degrades from O(log n) to O(n).
AVL Trees solve this problem by automatically maintaining balance after every insertion and deletion.
What Is an AVL Tree?
An AVL Tree is a Binary Search Tree with an additional rule:
For every node, the difference between the heights of its left and right subtrees must be at most 1.
This difference is called the balance factor.
Balance Factor
The balance factor of a node is defined as:
Height(left subtree) − Height(right subtree)
Allowed values are:
- -1
- 0
- +1
If the balance factor goes beyond this range, the tree must be rebalanced.
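To make the rule concrete, here is a minimal sketch (the function name is illustrative, not part of the lesson's API) that computes a balance factor from two subtree heights:

```python
def balance_factor(left_height, right_height):
    # Balance factor = height(left subtree) - height(right subtree)
    return left_height - right_height

# Left subtree height 2, right subtree height 1: within the allowed range
print(balance_factor(2, 1))   # 1
# Left subtree height 3, right subtree height 1: out of range, rebalance
print(balance_factor(3, 1))   # 2
```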
How AVL Trees Maintain Balance
AVL Trees use rotations to restore balance.
A rotation is a local restructuring of the tree that preserves the BST property.
Rotations keep the height of the tree logarithmic in the number of nodes, which is what keeps every operation fast.
Types of Imbalance
There are four classic imbalance cases:
- Left-Left (LL)
- Right-Right (RR)
- Left-Right (LR)
- Right-Left (RL)
Each case has a specific rotation strategy.
Left-Left (LL) Rotation
Occurs when a node becomes unbalanced after insertion into the left subtree of the left child.
This is fixed using a single right rotation.
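A single right rotation can be sketched as follows (a minimal illustration assuming each node stores its height; names such as right_rotate are assumptions, not from the lesson):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.height = 1  # a new node starts as a leaf of height 1

def height(node):
    return node.height if node else 0

def right_rotate(y):
    # y's left child x becomes the new subtree root;
    # x's old right subtree is reattached as y's left subtree.
    x = y.left
    t2 = x.right
    x.right = y
    y.left = t2
    # Recompute heights bottom-up (y first, since it now sits below x)
    y.height = 1 + max(height(y.left), height(y.right))
    x.height = 1 + max(height(x.left), height(x.right))
    return x

# LL case: 10 inserted into the left subtree of 30's left child
root = Node(30)
root.left = Node(20)
root.left.left = Node(10)
root.left.height = 2
root.height = 3
root = right_rotate(root)
print(root.key, root.left.key, root.right.key)  # 20 10 30
```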
Right-Right (RR) Rotation
Occurs when insertion happens in the right subtree of the right child.
This is fixed using a single left rotation.
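The single left rotation is the mirror image (again a minimal sketch with illustrative names, assuming each node stores its height):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.height = 1

def height(node):
    return node.height if node else 0

def left_rotate(x):
    # x's right child y becomes the new subtree root;
    # y's old left subtree is reattached as x's right subtree.
    y = x.right
    t2 = y.left
    y.left = x
    x.right = t2
    # Recompute heights bottom-up (x first, since it now sits below y)
    x.height = 1 + max(height(x.left), height(x.right))
    y.height = 1 + max(height(y.left), height(y.right))
    return y

# RR case: 30 inserted into the right subtree of 10's right child
root = Node(10)
root.right = Node(20)
root.right.right = Node(30)
root.right.height = 2
root.height = 3
root = left_rotate(root)
print(root.key, root.left.key, root.right.key)  # 20 10 30
```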
Left-Right (LR) Rotation
Occurs when insertion happens in the right subtree of the left child.
This requires a left rotation followed by a right rotation.
Right-Left (RL) Rotation
Occurs when insertion happens in the left subtree of the right child.
This requires a right rotation followed by a left rotation.
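Both double-rotation cases can be sketched by composing the two single rotations (a self-contained illustration; all helper names are assumptions, not a prescribed API):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.height = 1

def height(node):
    return node.height if node else 0

def update_height(node):
    node.height = 1 + max(height(node.left), height(node.right))
    return node

def right_rotate(y):
    x = y.left
    y.left, x.right = x.right, y
    update_height(y)
    return update_height(x)

def left_rotate(x):
    y = x.right
    x.right, y.left = y.left, x
    update_height(x)
    return update_height(y)

def rotate_left_right(node):
    # LR case: first a left rotation on the left child...
    node.left = left_rotate(node.left)
    # ...then a right rotation on the unbalanced node itself
    return right_rotate(node)

def rotate_right_left(node):
    # RL case: the mirror image of LR
    node.right = right_rotate(node.right)
    return left_rotate(node)

# LR example: 20 inserted into the right subtree of 30's left child
root = Node(30)
root.left = Node(10)
root.left.right = Node(20)
root.left.height = 2
root.height = 3
root = rotate_left_right(root)
print(root.key, root.left.key, root.right.key)  # 20 10 30
```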
AVL Insertion (Conceptual)
AVL insertion follows three steps:
First, insert the node as in a regular BST.
Then, walking back up the insertion path, update the height of each ancestor.
Finally, check each ancestor's balance factor and apply the appropriate rotation if it falls outside the allowed range.
The helper functions below assume each node stores its own height, with an empty subtree having height 0:

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.height = 1  # a new node starts as a leaf of height 1

def get_height(node):
    # An empty subtree has height 0
    if not node:
        return 0
    return node.height

def get_balance(node):
    # Balance factor: height(left subtree) - height(right subtree)
    if not node:
        return 0
    return get_height(node.left) - get_height(node.right)
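Putting the pieces together, a complete AVL insertion can be sketched as below. This is a self-contained illustration of the three steps (it re-defines the helpers so it can run on its own); the insert signature and rotation names are assumptions, not a prescribed API:

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None
        self.height = 1

def get_height(node):
    return node.height if node else 0

def get_balance(node):
    if not node:
        return 0
    return get_height(node.left) - get_height(node.right)

def update_height(node):
    node.height = 1 + max(get_height(node.left), get_height(node.right))
    return node

def right_rotate(y):
    x = y.left
    y.left, x.right = x.right, y
    update_height(y)
    return update_height(x)

def left_rotate(x):
    y = x.right
    x.right, y.left = y.left, x
    update_height(x)
    return update_height(y)

def insert(node, key):
    # Step 1: standard BST insertion
    if not node:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    # Step 2: update this ancestor's height on the way back up
    update_height(node)
    # Step 3: rebalance if the balance factor left the range [-1, +1]
    balance = get_balance(node)
    if balance > 1 and key < node.left.key:    # LL: single right rotation
        return right_rotate(node)
    if balance < -1 and key > node.right.key:  # RR: single left rotation
        return left_rotate(node)
    if balance > 1 and key > node.left.key:    # LR: left then right
        node.left = left_rotate(node.left)
        return right_rotate(node)
    if balance < -1 and key < node.right.key:  # RL: right then left
        node.right = right_rotate(node.right)
        return left_rotate(node)
    return node

# Inserting sorted keys, the worst case for a plain BST:
root = None
for key in range(1, 8):
    root = insert(root, key)
print(root.key, root.height)  # 4 3  (a plain BST would have height 7)
```

Note how the rebalancing happens as the recursion unwinds, so only the nodes on the insertion path are ever examined.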
Why AVL Trees Are Efficient
Because AVL Trees strictly control the balance factor at every node, the height of an AVL Tree with n nodes is at most roughly 1.44 log2(n), i.e. always logarithmic.
This guarantees:
Search, insertion, and deletion all run in O(log n) time.
This makes AVL Trees suitable for performance-critical applications.
Real-World Applications
AVL Trees are used in:
- Databases and indexing engines
- Memory management systems
- Real-time applications
- Compiler symbol tables
AVL Trees vs BST
Compared to regular BSTs, AVL Trees offer guaranteed O(log n) performance at the cost of extra bookkeeping: every insertion and deletion must update heights and may trigger rotations.
The tradeoff is worth it when consistent speed is required.
Exercises
Exercise 1:
What is the balance factor of a node?
Exercise 2:
Why are rotations needed in AVL Trees?
Exercise 3:
How many imbalance cases exist in AVL Trees?
Quick Quiz
Q1. Do AVL Trees allow imbalance?
Q2. Are AVL Trees faster than BSTs?
In the next lesson, we will explore Tries, a specialized tree structure used for fast string searching and prefix-based applications.