Implementing an LRU (Least Recently Used) cache in Java is a common question in software engineering interviews. An LRU cache is a popular caching strategy used to speed up access to frequently used data: it maintains a fixed-size cache and evicts the least recently used entry when it reaches capacity.
The LRU cache should support the following operations:
- set(key, value) – set the value for the given key in the cache. If the key already exists, update its value and mark it as the most recently used.
- get(key) – return the value for the given key. If the key does not exist in the cache, return null.
- evict() – remove the least recently used key-value pair from the cache when the cache is full.

To implement an LRU cache in Java, we can use a combination of a hash map and a doubly linked list. The hash map stores the key-value pairs for fast lookup, and the doubly linked list maintains the usage order, with the most recently used entry at the front of the list. When we need to evict, we simply remove the tail node of the list, which holds the least recently used entry.
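The paragraph above describes the classic hand-rolled design. Before the LinkedHashMap-based listing below, here is a minimal sketch of that structure (the class and helper names are illustrative, not from the original post):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative hand-rolled LRU cache: a HashMap gives O(1) key lookup, and a
// doubly linked list keeps entries ordered from most recently used (head)
// to least recently used (tail).
public class ManualLRUCache<K, V> {

    private static class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> map = new HashMap<>();
    private final Node<K, V> head = new Node<>(null, null); // sentinel before MRU end
    private final Node<K, V> tail = new Node<>(null, null); // sentinel after LRU end

    public ManualLRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToFront(node); // every read marks the entry most recently used
        return node.value;
    }

    public void set(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {           // existing key: update value and promote
            node.value = value;
            moveToFront(node);
            return;
        }
        if (map.size() == capacity) { // full: drop the node just before tail
            Node<K, V> lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        insertAfterHead(node);
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        insertAfterHead(node);
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void insertAfterHead(Node<K, V> node) {
        node.prev = head;
        node.next = head.next;
        head.next.prev = node;
        head.next = node;
    }

    public static void main(String[] args) {
        ManualLRUCache<Integer, String> cache = new ManualLRUCache<>(2);
        cache.set(1, "a");
        cache.set(2, "b");
        cache.get(1);       // touch key 1
        cache.set(3, "c");  // evicts key 2, the least recently used
        System.out.println(cache.get(2)); // null
    }
}
```

The two sentinel nodes at either end of the list remove the null checks that insertions and removals at the boundaries would otherwise need.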
Below is the Java code for implementing an LRU cache. We use a LinkedHashMap, which internally combines a hash table with a doubly linked list: by passing true for the access-order constructor parameter, it keeps entries ordered from least to most recently accessed, so we get the next and prev pointer bookkeeping for free instead of writing a custom node class. The size of the cache is defined as a constant, which is used to evict data when the map exceeds its capacity.
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> {

    private static final int CACHE_SIZE = 3;

    // Access-ordered LinkedHashMap: removeEldestEntry is consulted after each
    // put, so the least recently used entry is dropped once capacity is exceeded.
    private final Map<K, V> cache = new LinkedHashMap<K, V>(CACHE_SIZE, 0.75f, true) {
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > CACHE_SIZE;
        }
    };

    public void set(K key, V value) {
        cache.put(key, value);
    }

    public V get(K key) {
        return cache.getOrDefault(key, null);
    }

    // Manually remove the least recently used entry; iteration over an
    // access-ordered LinkedHashMap starts at the eldest entry.
    public void evict() {
        if (!cache.isEmpty()) {
            Map.Entry<K, V> eldest = cache.entrySet().iterator().next();
            cache.remove(eldest.getKey());
        }
    }
}
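To see the eviction order in action, here is a quick demo. The cache class is restated compactly inside the snippet, with explicit type parameters, so it compiles on its own:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCacheDemo {

    // Compact restatement of the cache above, with explicit generics.
    static class LRUCache<K, V> {
        private static final int CACHE_SIZE = 3;
        private final Map<K, V> cache = new LinkedHashMap<K, V>(CACHE_SIZE, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > CACHE_SIZE;
            }
        };
        public void set(K key, V value) { cache.put(key, value); }
        public V get(K key) { return cache.getOrDefault(key, null); }
    }

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>();
        cache.set(1, "one");
        cache.set(2, "two");
        cache.set(3, "three");
        cache.get(1);          // touch key 1: it becomes the most recently used
        cache.set(4, "four");  // exceeds CACHE_SIZE, evicting key 2 (the LRU entry)
        System.out.println(cache.get(2)); // null — evicted
        System.out.println(cache.get(1)); // one — survived because it was touched
    }
}
```

Note that reading key 1 before inserting key 4 is what saves it: with insertion order instead of access order, key 1 would have been evicted first.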
Note that this implementation is not thread-safe, as LinkedHashMap
is not synchronized by default. By passing the access-order parameter as true while creating the LinkedHashMap
object, we enable its ordering based on access times instead of insertion times.
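One simple way to add thread safety is to wrap the map with Collections.synchronizedMap, which locks each individual operation. This is a sketch, and the helper name boundedSynchronizedCache is mine; compound check-then-act sequences and iteration would still need external locking on the returned map:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedLRUDemo {

    // Hypothetical helper: an access-ordered, capacity-bounded LinkedHashMap
    // wrapped so that each put/get/remove is synchronized. The eviction in
    // removeEldestEntry runs inside the same lock as the triggering put.
    static <K, V> Map<K, V> boundedSynchronizedCache(final int capacity) {
        return Collections.synchronizedMap(new LinkedHashMap<K, V>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        });
    }

    public static void main(String[] args) {
        Map<Integer, String> cache = boundedSynchronizedCache(3);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.put(3, "c");
        cache.get(1);       // touch key 1
        cache.put(4, "d");  // key 2, now the least recently used, is evicted
        System.out.println(cache.get(2)); // null
    }
}
```

For highly concurrent workloads, a coarse lock around every operation can become a bottleneck, which is why purpose-built caches shard or use lock-free structures instead.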
In this blog post, we have seen how to implement an LRU cache in Java. It is a common interview question, and candidates must be aware of the hash map and doubly linked list data structures. Implementing a thread-safe cache and handling concurrency issues is also crucial in a real-world application.