| Data Structure | Implementation / Variant | Access / Search | Insertion | Deletion | Notes |
| --- | --- | --- | --- | --- | --- |
| Array | - | $O(1)$ | $O(n)$ | $O(n)$ | Allows random access by index. |
| List | Linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has a pointer to the next one. See the linked-list sketch after this table. |
| List | Doubly linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has pointers to both the next and the previous node. |
| List | Circular linked list | $O(n)$ | $O(1)$ | $O(1)$ | The head and tail are linked to each other. |
| Hash Table | - | $O(1)$ | $O(1)$ | $O(1)$ | Assuming a good hash function is used and the table is not too full. In the worst case, where many collisions happen, time complexity can be $O(n)$. |
| Set | Using a hash table | $O(1)$ | $O(1)$ | $O(1)$ | The worst case of some operations, such as finding the maximum or minimum element, can be $O(n)$ if the implementation does not keep track of these values separately. |
| Set | Using a self-balancing BST (AVL, red-black, or splay tree) | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Same as above. |
| Stack | Using a linked list | $O(1)$ (Last) | $O(1)$ (Push) | $O(1)$ (Pop) | The basic LIFO operations take constant time, regardless of the number of elements. |
| Stack | Using a fixed-size array | $O(1)$ (Last) | $O(n)$ (Push) | $O(1)$ (Pop) | Insertion may be proportional to the size of the stack if a new array allocation is needed. See the array-backed stack sketch after this table. |
| Queue | Using a linked list | $O(1)$ (First) | $O(1)$ (Enqueue) | $O(1)$ (Dequeue) | The basic FIFO operations take constant time, regardless of the number of elements. |
| Queue | Priority Queue (binary heap) | $O(1)$ (First) | $O(\log n)$ (Enqueue) | $O(\log n)$ (Dequeue) | Using a binary heap, the highest-priority element can be read in $O(1)$, while enqueueing and dequeueing take $O(\log n)$. See the min-heap sketch after this table. |
| Tree | Binary Tree | $O(n)$ | $O(n)$ | $O(n)$ | Without an ordering constraint, an operation may have to visit every node. |
| Tree | Binary Search Tree | $O(\log n)$ (average) | $O(\log n)$ (average) | $O(\log n)$ (average) | The worst case is $O(n)$, when the BST degenerates into a linear chain of $n$ nodes. |
| Tree | Red-Black Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Self-balancing guarantees that the height of the tree remains logarithmic in the number of keys. |
| Tree | Trie (prefix tree) | $O(m)$ | $O(m)$ | $O(m)$ | Where $m$ is the length of the string. See the trie sketch after this table. |
| Tree | B-Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | The height of the tree remains logarithmic in the number of keys. |
| Tree | Interval Tree | $O(\log n + k)$ | $O(\log n)$ | $O(\log n)$ | Where $k$ is the number of intervals reported by a query. |
| Tree | M-ary / K-ary Tree | $O(\log_m n)$ | $O(\log_m n)$ | $O(\log_m n)$ | Assumes the tree is balanced; in practice, performance can vary depending on many factors. |
| Tree | Order-Statistic Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | A specialized Red-Black Tree, and therefore self-balancing. |
| Tree | Van Emde Boas Tree | $O(\log \log n)$ | $O(\log \log n)$ | $O(\log \log n)$ | Requires integer keys drawn from a bounded universe. |
| Heap | Binary Min/Max Heap | $O(1)$ (peek min/max) | $O(\log n)$ | $O(\log n)$ | A complete binary tree, typically stored in an array. |
| Heap | Mergeable Heap | - | - | - | The time complexity of the basic operations depends on the underlying heap implementation; only the "merge" operation is added on top. |
| Heap | Fibonacci Heap | $O(1)$ | $O(1)$ (amortized) | $O(\log n)$ (amortized) | Decrease-key is also $O(1)$ amortized. |
| Graph | - | - | - | - | - |
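
A minimal TypeScript sketch of the linked-list row above (class and method names are illustrative, not taken from any particular library): prepending only rewires the head pointer, which is $O(1)$, while searching may have to follow every `next` pointer, which is $O(n)$.

```typescript
// Singly linked list: O(1) insertion at the head, O(n) search.
class ListNode<T> {
  constructor(public value: T, public next: ListNode<T> | null = null) {}
}

class SinglyLinkedList<T> {
  private head: ListNode<T> | null = null;

  // O(1): only the head pointer changes, no matter how long the list is.
  prepend(value: T): void {
    this.head = new ListNode(value, this.head);
  }

  // O(n): in the worst case every node is visited.
  contains(value: T): boolean {
    for (let node = this.head; node !== null; node = node.next) {
      if (node.value === value) return true;
    }
    return false;
  }
}

const list = new SinglyLinkedList<number>();
list.prepend(3);
list.prepend(7);
console.log(list.contains(3));  // true
console.log(list.contains(42)); // false
```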
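
A sketch of the array-backed stack row, again with illustrative names: push and pop only touch the top slot, but when the fixed-size backing array fills up, a new array must be allocated and every element copied, which is the $O(n)$ worst case noted in the table (doubling the capacity keeps push $O(1)$ amortized).

```typescript
// Stack on a fixed-size backing array: O(1) peek/pop, O(n) push when the array must grow.
class ArrayStack<T> {
  private items: T[];
  private size = 0;

  constructor(initialCapacity = 4) {
    this.items = new Array<T>(initialCapacity);
  }

  push(value: T): void {
    if (this.size === this.items.length) {
      // Worst case: copy every element into a larger array (O(n)).
      const bigger = new Array<T>(this.items.length * 2);
      for (let i = 0; i < this.size; i++) bigger[i] = this.items[i];
      this.items = bigger;
    }
    this.items[this.size++] = value; // O(1)
  }

  pop(): T | undefined {
    return this.size === 0 ? undefined : this.items[--this.size]; // O(1)
  }

  peek(): T | undefined {
    return this.size === 0 ? undefined : this.items[this.size - 1]; // O(1)
  }
}

const stack = new ArrayStack<string>(2);
stack.push("a");
stack.push("b");
stack.push("c"); // triggers the O(n) copy into a larger backing array
console.log(stack.pop()); // "c"
```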
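
A sketch of the priority-queue row using a binary min-heap (here the smallest number is assumed to be the highest priority, purely for the example): the minimum sits at index 0, so peeking is $O(1)$, while enqueue and dequeue each walk a single root-to-leaf path of height $\log n$.

```typescript
// Binary min-heap stored in an array: O(1) peek, O(log n) enqueue/dequeue.
class MinHeap {
  private heap: number[] = [];

  peek(): number | undefined {
    return this.heap[0]; // O(1): the minimum is always at the root.
  }

  enqueue(value: number): void {
    this.heap.push(value);
    // Bubble up: at most log n swaps.
    let i = this.heap.length - 1;
    while (i > 0) {
      const parent = (i - 1) >> 1;
      if (this.heap[parent] <= this.heap[i]) break;
      [this.heap[parent], this.heap[i]] = [this.heap[i], this.heap[parent]];
      i = parent;
    }
  }

  dequeue(): number | undefined {
    if (this.heap.length === 0) return undefined;
    const min = this.heap[0];
    const last = this.heap.pop()!;
    if (this.heap.length > 0) {
      this.heap[0] = last;
      // Sift down: at most log n swaps.
      let i = 0;
      while (true) {
        const left = 2 * i + 1;
        const right = 2 * i + 2;
        let smallest = i;
        if (left < this.heap.length && this.heap[left] < this.heap[smallest]) smallest = left;
        if (right < this.heap.length && this.heap[right] < this.heap[smallest]) smallest = right;
        if (smallest === i) break;
        [this.heap[i], this.heap[smallest]] = [this.heap[smallest], this.heap[i]];
        i = smallest;
      }
    }
    return min;
  }
}

const pq = new MinHeap();
[5, 1, 4, 2].forEach((n) => pq.enqueue(n));
console.log(pq.peek());    // 1
console.log(pq.dequeue()); // 1
console.log(pq.dequeue()); // 2
```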
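
A sketch of the trie row: each character of the key consumes exactly one edge, so insert and search cost $O(m)$ for a key of length $m$, independent of how many other words are stored.

```typescript
// Trie (prefix tree): O(m) insert and search, where m is the key length.
class TrieNode {
  children = new Map<string, TrieNode>();
  isWord = false;
}

class Trie {
  private root = new TrieNode();

  insert(word: string): void {
    let node = this.root;
    for (const ch of word) {
      if (!node.children.has(ch)) node.children.set(ch, new TrieNode());
      node = node.children.get(ch)!;
    }
    node.isWord = true; // mark the end of a complete word
  }

  search(word: string): boolean {
    let node = this.root;
    for (const ch of word) {
      const next = node.children.get(ch);
      if (!next) return false;
      node = next;
    }
    return node.isWord;
  }
}

const trie = new Trie();
trie.insert("car");
trie.insert("card");
console.log(trie.search("car")); // true
console.log(trie.search("ca"));  // false (prefix only, not a stored word)
```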