Version 3

    LRU stands for Least Recently Used. LRU is a replacement strategy that specifies which entry to evict when a cache needs a free position for a new entry. With LRU the cache chooses the entry that hasn't been accessed for the longest time compared with all other cache entries. The old entry at this position is replaced with the new entry.
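The core of the strategy can be sketched in a few lines of Python: track when each entry was last accessed and pick the one with the oldest access time as the eviction victim. The dictionary values below are illustrative request counters, not from the text.

```python
# Last access time of each cached entry (higher = more recent).
# The keys and counter values here are made up for illustration.
last_access = {"x": 3, "y": 1, "z": 7}

# The LRU victim is the entry with the smallest last-access time.
victim = min(last_access, key=last_access.get)  # -> "y"
```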




    The cache is set to a maximum size of 5 entries. The cache is empty at the beginning, and the following objects are requested from the cache (a miss makes the cache fetch the object from the resource and store it inside itself):


    A!, B!, C!, B, B, C, E!, D!, E, A, F!


    !         means a cache miss that leads to loading the object from the resource


    (no mark) means a cache hit, the object is served directly from the cache


    At the time F is requested from the cache it needs to be fetched from the resource and put into the cache. Since the cache is at its size limit, one cache entry needs to be deleted before F can take its place. In the above example B is the entry that gets deleted and replaced by F, because at the moment F is requested B hasn't been requested for the longest time compared to all the other entries.
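The walkthrough above can be replayed in code. The sketch below is an assumed minimal LRU cache built on Python's OrderedDict (the function and variable names are mine, not from the text); it reports each request as a hit or miss and shows that B is indeed the entry evicted when F arrives.

```python
from collections import OrderedDict

def simulate(requests, capacity):
    """Replay a request sequence against an LRU cache of the given size.

    Returns the trace of (key, was_hit) pairs and the final cache contents,
    ordered from least to most recently used.
    """
    cache = OrderedDict()  # least recently used entry sits at the front
    trace = []
    for key in requests:
        if key in cache:
            cache.move_to_end(key)          # refresh: now most recently used
            trace.append((key, True))       # cache hit
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)   # evict the least recently used
            cache[key] = object()           # stand-in for the loaded object
            trace.append((key, False))      # cache miss, fetched from resource
    return trace, list(cache)

trace, final = simulate(["A", "B", "C", "B", "B", "C",
                         "E", "D", "E", "A", "F"], capacity=5)
# final == ["C", "D", "E", "A", "F"] -- B was evicted to make room for F
```

Running it reproduces the hit/miss pattern of the sequence above: A, B, C, E, D and F are misses, the remaining requests are hits.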