To implement a cache with the given requirements, I have chosen Python as my preferred language due to its simplicity and ease of use. I will use a dictionary to represent the cache, where each key maps to a node holding the corresponding value. The cache size is fixed, and a doubly linked list keeps track of the order in which items were used.
The code below implements a cache with the given requirements:
class Node:
    def __init__(self, key, value):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.dict = {}
        # Sentinel head and tail nodes simplify insertion and removal.
        self.head = Node(0, 0)
        self.tail = Node(0, 0)
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Unlink a node from the list.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add(self, node):
        # Insert a node right after the head (most recently used position).
        node.prev = self.head
        node.next = self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key in self.dict:
            node = self.dict[key]
            self._remove(node)
            self._add(node)
            return node.value
        return -1

    def put(self, key, value):
        if key in self.dict:
            self._remove(self.dict[key])
        node = Node(key, value)
        self._add(node)
        self.dict[key] = node
        if len(self.dict) > self.capacity:
            # Evict the least recently used item, found just before the tail.
            node = self.tail.prev
            self._remove(node)
            del self.dict[node.key]
We have used a Node class to represent the items in the cache. The LRUCache class has an internal dictionary, dict, that maps keys to nodes. The least recently used item is maintained at the tail of the doubly linked list, and the most recently used item is maintained at the head of the list. If an item is accessed, it is moved to the head of the list. When a new item is added, it is added at the head of the list. If the cache is full, the least recently used item is removed.
The test cases below can be used to verify the implementation of the cache:
cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
cache.get(1)     # returns 1
cache.put(3, 3)  # evicts key 2
cache.get(2)     # returns -1 (not found)
cache.put(4, 4)  # evicts key 1
cache.get(1)     # returns -1 (not found)
cache.get(3)     # returns 3
cache.get(4)     # returns 4
For the above test case, if each return value of get is printed, the output will be:

1
-1
-1
3
4
I chose to implement the cache in Python because it is quick to write clean code in, and I am familiar with the language. I chose a dictionary to represent the cache because it provides O(1) average-case access to the data. To maintain the order of items, I implemented a doubly linked list, where the head represents the most recently used item and the tail the least recently used item.
When a new item is added to the cache, it is added at the head of the linked list. When an item is accessed, it is moved to the head of the linked list. If the cache reaches its maximum size, the least recently used item at the tail of the linked list is removed.
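As an aside, the same eviction policy can be sketched more compactly with the standard library's collections.OrderedDict, which remembers insertion order and supports moving a key to the end in O(1). This is an alternative sketch, not the implementation above; the class name LRUCacheOD is made up for illustration:

```python
from collections import OrderedDict

class LRUCacheOD:
    """LRU cache sketch built on OrderedDict (illustrative alternative)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            # popitem(last=False) removes the oldest (least recently used) entry.
            self.data.popitem(last=False)
```

This variant trades the explicit pointer bookkeeping for OrderedDict's internal linked list, at the cost of hiding the mechanism the hand-rolled version makes visible.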