Wednesday, January 10, 2018

CACHE EVICTION ALGORITHMS: BEST PRACTICES


 

In progress ...

The Facebook story

https://research.fb.com/the-evolution-of-advanced-caching-in-the-facebook-cdn/

Cache algorithms: Know-How










A cache algorithm is a detailed list of instructions that directs which items should be discarded in a computing device's cache of information. 

Least Frequently Used (LFU): This cache algorithm uses a counter to keep track of how often an entry is accessed. With the LFU cache algorithm, the entry with the lowest count is removed first. This method isn't used that often, as it does not account for an item that had an initially high access rate and then was not accessed for a long time.
Least Recently Used (LRU): This cache algorithm keeps recently used items near the top of the cache. Whenever a new item is accessed, the LRU places it at the top of the cache. When the cache limit has been reached, items that have been accessed less recently are removed, starting from the bottom of the cache. This can be an expensive algorithm to use, as it needs to keep "age bits" that show exactly when each item was accessed. In addition, when an LRU cache algorithm deletes an item, the "age bit" changes on all the other items.
Adaptive Replacement Cache (ARC): Developed at the IBM Almaden Research Center, this cache algorithm keeps track of both LFU and LRU, as well as evicted cache entries, to get the best use out of the available cache.
Most Recently Used (MRU): This cache algorithm removes the most recently used items first. An MRU algorithm is good in situations in which the older an item is, the more likely it is to be accessed, as when visiting photo galleries sequentially, where the image just viewed is rarely revisited.
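As a quick illustration of the LRU policy described above, here is a minimal sketch in Java built on LinkedHashMap's access-order mode (the class name, capacity, and keys are illustrative, not taken from any of the linked articles):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU sketch: LinkedHashMap with accessOrder = true keeps the
// most recently accessed entry at the tail, so the eldest entry is the LRU one.
public class LruSketch<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public LruSketch(int maxSize) {
        super(16, 0.75f, true); // accessOrder = true enables LRU ordering
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize; // evict the least recently used entry when over capacity
    }

    public static void main(String[] args) {
        LruSketch<String, Integer> cache = new LruSketch<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");     // touch "a" so it becomes most recent
        cache.put("c", 3);  // evicts "b", the least recently used
        System.out.println(cache.containsKey("b")); // false
        System.out.println(cache.containsKey("a")); // true
    }
}
```

Note this single-threaded sketch pushes the "age bits" bookkeeping into the linked list maintained by LinkedHashMap, which is why each get is O(1) rather than touching every entry.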

Cache algorithms: Best Practices

https://vladmihalcea.com/caching-best-practices/

Cache algorithms: Third-Party Options

Guava cache
https://dzone.com/articles/google-guava-cache
Ehcache

Cache algorithms: Implementing your own API

https://www.quora.com/What-is-the-best-way-to-Implement-an-LRU-Cache

https://www.careercup.com/question?id=5743058519851008

Citigroup Interview Question for Financial Software Developers


LRU 

http://www.javarticles.com/2012/06/lru-cache.html

https://www.careercup.com/question?id=17230678

https://www.programcreek.com/2013/03/leetcode-lru-cache-java/

http://www.ideserve.co.in/learn/lru-cache-implementation



https://gist.github.com/vidz2015/39eeb613a80b54e072481209cffc7551

1. Define a map, a queue, and an int maxSize of your choice.
2. The put operation is critical:
   1. Check if the map already has the key; if so, remove it from the queue, as the key needs to be re-added at the tail (most recent position).
   2. While the cache is full, poll the oldest key from the head of the queue and remove it from the map; then add the key to the queue and store the value against it in the map.
      
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLRUCache<Key, Value> {

private final int maxSize;
private final ConcurrentHashMap<Key, Value> map;
private final ConcurrentLinkedQueue<Key> queue;

public ConcurrentLRUCache(final int maxSize) {
    this.maxSize = maxSize;
    map = new ConcurrentHashMap<>(maxSize);
    queue = new ConcurrentLinkedQueue<>();
}

/**
 * Inserts a key/value pair, evicting the oldest entries once the cache is full.
 */
public void put(final Key key, final Value value) {
    if (map.containsKey(key)) {
        queue.remove(key); // remove the key from the FIFO queue so it can be re-added at the tail
    }

    while (queue.size() >= maxSize) {
        Key oldestKey = queue.poll(); // the head of the queue holds the oldest key
        if (null != oldestKey) {
            map.remove(oldestKey);
        }
    }
    queue.add(key);
    map.put(key, value);
}

/**
 * Note: this get does not refresh recency, so the queue order is insertion
 * order; a strict LRU would also move the key to the tail here.
 *
 * @param key - may not be null!
 * @return the value associated to the given key or null
 */
public Value get(final Key key) {
    return map.get(key);
}
}

LFU

http://www.javarticles.com/2012/06/lfu-cache.html
http://dhruvbird.com/lfu.pdf
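Before working through the O(1) scheme in the paper above, the counter-based LFU idea can be sketched naively as follows (an illustrative class of my own with O(n) eviction, not code from the linked articles):

```java
import java.util.HashMap;
import java.util.Map;

// Naive LFU sketch: keeps a per-key access counter and evicts the
// key with the smallest count via a linear scan (O(n) eviction).
public class LfuSketch<Key, Value> {
    private final int maxSize;
    private final Map<Key, Value> values = new HashMap<>();
    private final Map<Key, Integer> counts = new HashMap<>();

    public LfuSketch(int maxSize) {
        this.maxSize = maxSize;
    }

    public Value get(Key key) {
        if (!values.containsKey(key)) {
            return null;
        }
        counts.merge(key, 1, Integer::sum); // bump the access counter
        return values.get(key);
    }

    public void put(Key key, Value value) {
        if (!values.containsKey(key) && values.size() >= maxSize) {
            // Evict the least frequently used key (lowest counter)
            Key victim = counts.entrySet().stream()
                    .min(Map.Entry.comparingByValue())
                    .get().getKey();
            values.remove(victim);
            counts.remove(victim);
        }
        values.put(key, value);
        counts.merge(key, 1, Integer::sum);
    }

    public static void main(String[] args) {
        LfuSketch<String, Integer> cache = new LfuSketch<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");     // "a" now has a higher count than "b"
        cache.put("c", 3);  // evicts "b", the least frequently used
        System.out.println(cache.get("b")); // null
    }
}
```

This sketch also exhibits the LFU weakness noted earlier: an entry that was hot once keeps its high count and can linger long after it stops being accessed.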
MRU





To do (Check the following Links in spare time)


https://youtu.be/MylI8HmgMBk






Splay Tree :  https://en.wikipedia.org/wiki/Splay_tree
https://www.slideshare.net/xqin74/performance-evaluation-of-traditional-caching-policies-on-a-large-system-with-petabytes-of-data
Hardware cache
http://www.hardwaresecrets.com/how-the-cache-memory-works/
