Introduction
Efficient caching is a crucial part of optimizing software performance. One of the most widely used caching techniques is the Least Recently Used (LRU) Cache. An LRU Cache retains the most recently accessed data and evicts the least recently used data when the cache reaches its capacity.
In this blog, we will:

- Understand the concept of LRU Cache
- Learn how to implement LRU Cache in Java
- Walk through a step-by-step breakdown of the code
- See an example usage scenario
What is an LRU Cache?
An LRU Cache is a fixed-size cache that follows these principles:

- Most Recently Used (MRU) items are always at the front
- Least Recently Used (LRU) items are always at the back
- When a new item would push the cache past its capacity, the least recently used item is removed
- All operations (get and put) must run in O(1) time complexity
Data Structures Used for LRU Cache Implementation
To achieve O(1) time complexity for insertion, deletion, and retrieval, we combine two data structures:

- Doubly Linked List (DLL): stores (key, value) pairs and allows fast removal and insertion at both ends
- HashMap (Map<Integer, Node> cache): maps keys to nodes for O(1) access
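Before building this by hand, it is worth knowing that Java's standard library can express the same idea very compactly: a LinkedHashMap constructed in access order, with removeEldestEntry overridden, behaves as an LRU cache. This is a minimal sketch for comparison (class names LinkedHashLru and LinkedHashLruDemo are illustrative, not part of any library), not a replacement for the hand-rolled version below, which is what you would write in an interview:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch on top of the standard library.
// accessOrder=true makes iteration order follow access recency (eldest first),
// and removeEldestEntry evicts the least recently used entry on overflow.
class LinkedHashLru<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LinkedHashLru(int capacity) {
        super(capacity, 0.75f, true); // true = access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

public class LinkedHashLruDemo {
    public static void main(String[] args) {
        LinkedHashLru<Integer, Integer> lru = new LinkedHashLru<>(2);
        lru.put(1, 1);
        lru.put(2, 2);
        lru.get(1);      // touches key 1, so key 2 is now least recently used
        lru.put(3, 3);   // evicts key 2
        System.out.println(lru.containsKey(2)); // false
        System.out.println(lru.keySet());       // [1, 3]  (eldest first)
    }
}
```

The custom implementation below makes the same machinery explicit: the HashMap plays the role of LinkedHashMap's hash table, and the doubly linked list plays the role of its internal recency order.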
Java Implementation of LRU Cache
Below is the Java implementation of an LRU Cache:
```java
import java.util.*;

class LRUCache {
    // Node of the doubly linked list holding one (key, value) pair.
    class Node {
        int key, value;
        Node prev, next;

        Node(int k, int v) {
            key = k;
            value = v;
        }
    }

    private final int capacity;
    private final Map<Integer, Node> cache;
    private final Node head, tail; // dummy sentinels

    public LRUCache(int capacity) {
        this.capacity = capacity;
        cache = new HashMap<>();
        head = new Node(0, 0);
        tail = new Node(0, 0);
        head.next = tail;
        tail.prev = head;
    }

    public int get(int key) {
        if (!cache.containsKey(key)) return -1;
        Node node = cache.get(key);
        deleteNode(node);
        insertAfterHead(node); // move to MRU position
        return node.value;
    }

    public void put(int key, int value) {
        if (cache.containsKey(key)) {
            Node node = cache.get(key);
            node.value = value;
            deleteNode(node);
            insertAfterHead(node);
        } else {
            if (cache.size() == capacity) {
                Node node = tail.prev; // least recently used
                cache.remove(node.key);
                deleteNode(node);
            }
            Node newNode = new Node(key, value);
            cache.put(key, newNode);
            insertAfterHead(newNode);
        }
    }

    // Inserts node right after the dummy head (MRU position).
    private void insertAfterHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    // Unlinks node from the doubly linked list.
    private void deleteNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
}
```
How the Code Works
1. Constructor (LRUCache(int capacity))

- Initializes the cache with a given capacity.
- Creates dummy head and tail nodes to simplify operations at the list boundaries.
- Uses a HashMap to store key-node mappings for O(1) lookup.
2. get(int key) Method

- If the key is not present, return -1.
- If the key exists, move the node to the front (Most Recently Used position) and return its value.
3. put(int key, int value) Method

- If the key already exists, update its value and move it to the front.
- If the key is new:
  - If the cache is full, remove the Least Recently Used (LRU) item.
  - Add the new key-value pair to the front.
4. insertAfterHead(Node node) Method

- Adds a node immediately after head (marking it as Most Recently Used).

5. deleteNode(Node node) Method

- Removes a node from the Doubly Linked List.
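The two helpers are just four pointer updates each, and the dummy sentinels mean there are no null checks at the ends. This standalone sketch (class DllSketch and the order() helper are illustrative names, mirroring the helpers above on a stripped-down Node) shows the list order they produce:

```java
// Standalone sketch of the DLL helpers with dummy head/tail sentinels.
// insertAfterHead and deleteNode mirror the private helpers in LRUCache above.
public class DllSketch {
    static class Node {
        int key;
        Node prev, next;
        Node(int key) { this.key = key; }
    }

    static final Node head = new Node(0), tail = new Node(0);
    static {
        head.next = tail;
        tail.prev = head;
    }

    static void insertAfterHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    static void deleteNode(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    // Reads keys front (MRU) to back (LRU), skipping the sentinels.
    static String order() {
        StringBuilder sb = new StringBuilder();
        for (Node n = head.next; n != tail; n = n.next) sb.append(n.key).append(' ');
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        Node a = new Node(1), b = new Node(2), c = new Node(3);
        insertAfterHead(a);          // list: 1
        insertAfterHead(b);          // list: 2 1
        insertAfterHead(c);          // list: 3 2 1
        deleteNode(a);               // evict from the back: 3 2
        insertAfterHead(b = a);      // re-insert key 1 at the front: 1 3 2
        System.out.println(order()); // prints "1 3 2"
    }
}
```

A delete followed by an insertAfterHead on the same node is exactly the "move to front" step that get and put perform on a cache hit.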
Example Usage
Here’s a simple example demonstrating how the LRU Cache works:
```java
public class Main {
    public static void main(String[] args) {
        LRUCache lru = new LRUCache(2);

        lru.put(1, 1);                  // Cache: {1=1}
        lru.put(2, 2);                  // Cache: {1=1, 2=2}
        System.out.println(lru.get(1)); // Returns 1, moves key 1 to the front
        lru.put(3, 3);                  // Evicts key 2, Cache: {1=1, 3=3}
        System.out.println(lru.get(2)); // Returns -1 (not found)
        lru.put(4, 4);                  // Evicts key 1, Cache: {3=3, 4=4}
        System.out.println(lru.get(1)); // Returns -1 (not found)
        System.out.println(lru.get(3)); // Returns 3
        System.out.println(lru.get(4)); // Returns 4
    }
}
```
Cache State at Each Step

| Operation | Returns | Cache State After (MRU first) |
|-----------|---------|-------------------------------|
| put(1,1)  | –       | {1=1}                         |
| put(2,2)  | –       | {2=2, 1=1}                    |
| get(1)    | 1       | {1=1, 2=2} (1 moved to front) |
| put(3,3)  | –       | {3=3, 1=1} (2 evicted)        |
| get(2)    | -1      | {3=3, 1=1} (not found)        |
| put(4,4)  | –       | {4=4, 3=3} (1 evicted)        |
| get(1)    | -1      | {4=4, 3=3} (not found)        |
| get(3)    | 3       | {3=3, 4=4} (3 moved to front) |
| get(4)    | 4       | {4=4, 3=3} (4 moved to front) |
Conclusion
Implementing an LRU Cache efficiently requires combining a HashMap with a Doubly Linked List to achieve O(1) insertion, deletion, and retrieval. This Java implementation provides a clean and optimized approach to managing a Least Recently Used cache.
By understanding this pattern, you can optimize memory usage and performance in applications that require frequent lookups with a fixed cache size.
Have any questions? Feel free to drop them in the comments! 🚀