An implementation of an LRU cache. LRU (Least Recently Used) is one of the cache algorithms (also frequently called cache replacement algorithms or cache replacement policies).
Logic
- Discards the least recently used items first.
- This algorithm requires keeping track of what was used when, which is expensive if one wants to make sure the algorithm always discards the least recently used item.
- General implementations of this technique require keeping "age bits" for cache lines and tracking the "least recently used" cache line based on those age bits.
- In such an implementation, every time a cache line is used, the age of every other cache line changes.
Algorithm explanation
For a cache that can hold x page frames:
- Check whether the requested page is present in the cache.
- If it is not present, check whether the cache is full:
- If the cache is full, remove the last element from the cache and insert the requested page at the front; if it is not full, simply insert the requested page at the front.
- If the page is present in the cache, move it from its current position to the front of the cache.
- This way the least recently used pages stay at the back of the cache and the most recently used pages stay at the front.
Every time a requested page is not found in the cache, that is a miss or page fault; if the page is present in the cache, it's a hit.
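As a rough illustration of these steps, here is a minimal sketch in Go built on the standard library's container/list and a map. The names lruCache, newLRUCache and Refer are made up for this example and are not necessarily the repository's actual API; together with the usage trace shown after the next section, it forms one small program.

```go
package main

import (
	"container/list"
	"fmt"
)

// lruCache keeps the most recently used page at the front of the list
// and the least recently used page at the back.
type lruCache struct {
	capacity int
	pages    *list.List            // doubly linked list of page numbers
	index    map[int]*list.Element // page number -> its node in the list
}

func newLRUCache(capacity int) *lruCache {
	return &lruCache{
		capacity: capacity,
		pages:    list.New(),
		index:    make(map[int]*list.Element),
	}
}

// Refer requests a page and reports whether the request was a hit.
func (c *lruCache) Refer(page int) bool {
	// Hit: the page is already cached, so move it to the front.
	if node, ok := c.index[page]; ok {
		c.pages.MoveToFront(node)
		return true
	}
	// Miss (page fault): if the cache is full, remove the last element,
	// which is the least recently used page.
	if c.pages.Len() == c.capacity {
		last := c.pages.Back()
		c.pages.Remove(last)
		delete(c.index, last.Value.(int))
	}
	// Insert the requested page at the front and remember its node.
	c.index[page] = c.pages.PushFront(page)
	return false
}
```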
Data Structures used
- In the algorithm below we use two different data structures: one is a doubly linked list and the other is a hash map.
- The linked list holds the pages, and the hash map maps each page to the address of its node in the list.
- Every time a new page is requested, we first check in the hash map whether the page is present or not.
- If it is not present and the cache is full, we simply delete the last entry in the cache and insert the requested page at the front.
- If it is present, we move that page from its current location to the beginning of the cache and update the address stored in the hash map for that page.
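Continuing the sketch above, a possible usage trace might look like the following ("fmt" is already in the sketch's imports); the comments describe the cache contents from most to least recently used.

```go
func main() {
	cache := newLRUCache(2)

	fmt.Println(cache.Refer(1)) // false: miss (page fault), cache is now [1]
	fmt.Println(cache.Refer(2)) // false: miss, cache is now [2, 1]
	fmt.Println(cache.Refer(1)) // true:  hit, 1 moves to the front -> [1, 2]
	fmt.Println(cache.Refer(3)) // false: miss, 2 is least recently used and is evicted -> [3, 1]
	fmt.Println(cache.Refer(2)) // false: miss, 2 was evicted in the previous step
}
```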
Author
- Nitin Sharma