What happens when the cache is full and a cache miss occurs? The incoming data will need to replace an existing entry in the cache. But which entry?

The cache's replacement policy decides which populated entry is replaced with new data. The decision might be random, or it might use information about the cache entries. When choosing a replacement policy for an architecture, designers weigh the potential performance improvement against the added design complexity.

First In First Out (FIFO)

This policy replaces entries in the order they entered the cache. The cache maintains an index that points to the next entry to be replaced. After a replacement, the index is incremented, or reset to the first entry if the last entry was just replaced.
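The index bookkeeping above can be sketched in a few lines. This is a minimal illustration, not the exercise's actual code; the `FIFOCache` name, the 4-entry size, and the `(address, data)` entry layout are assumptions.

```python
class FIFOCache:
    """Sketch of FIFO replacement bookkeeping (illustrative, not the lesson's code)."""

    def __init__(self, size=4):
        self.size = size
        self.entries = [None] * size   # each entry holds an (address, data) pair
        self.index = 0                 # next entry to replace

    def replace(self, address, data):
        # Overwrite the entry the index currently points to...
        self.entries[self.index] = (address, data)
        # ...then advance the index, wrapping back to the first
        # entry after the last entry has been replaced.
        self.index = (self.index + 1) % self.size
```

The modulo operation implements the wrap-around: once the index passes the last entry, it returns to entry 0.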

Least Recently Used (LRU)

This policy replaces the entry that has gone the longest without being accessed. Each entry must therefore track when it was last accessed relative to the other entries. LRU is the most difficult of the three policies to implement, and the extra complexity may not be worth the performance gain.
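One way to model that per-entry bookkeeping is with a logical clock: every access stamps the entry, and the eviction victim is the entry with the oldest stamp. The sketch below is an assumption about how this might look in software, not the lesson's implementation; the `LRUCache` name and the `load` callback are hypothetical.

```python
class LRUCache:
    """Sketch of LRU bookkeeping: each entry records when it was last touched."""

    def __init__(self, size=4):
        self.size = size
        self.entries = {}    # address -> data
        self.last_used = {}  # address -> logical timestamp of last access
        self.clock = 0

    def read(self, address, load):
        self.clock += 1
        if address in self.entries:            # cache hit: refresh the stamp
            self.last_used[address] = self.clock
            return self.entries[address]
        if len(self.entries) >= self.size:     # full: evict least recently used
            victim = min(self.last_used, key=self.last_used.get)
            del self.entries[victim], self.last_used[victim]
        data = load(address)                   # cache miss: fetch from memory
        self.entries[address] = data
        self.last_used[address] = self.clock
        return data
```

In hardware, maintaining these relative ages for every entry is what makes LRU costly, which is why the text notes the design may not be worth the improvement.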

Random Replacement

This policy chooses a cache entry at random. It is easier to implement than the FIFO and LRU policies.
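The simplicity is easy to see in code: choosing a victim takes a single random draw, with no per-entry state to maintain. The snippet below is illustrative; the `random_policy` helper and the string entries are assumptions for the example.

```python
import random

def random_policy(num_entries):
    """Choose a victim entry index uniformly at random."""
    return random.randrange(num_entries)

# Replacing into a full 4-entry cache: any entry may be evicted.
entries = ["A", "B", "C", "D"]
victim = random_policy(len(entries))
entries[victim] = "E"
```

Unlike FIFO's index or LRU's timestamps, nothing needs to be updated on a hit, which is why random replacement is the cheapest to build.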

Choosing the right replacement policy is key to increasing the number of cache hits the processor achieves. The random replacement policy is simple to implement but might cause more cache misses than the other two policies. The LRU policy is more complicated but tends to do a better job of keeping data in the cache that will be used again.



Run the code. The instructions in this exercise retrieve more than four pieces of data.

The first four characters retrieved from data ("T", "h", "i", and "s") stay in the cache throughout the instruction execution. The characters "i" and "s" are cache hits later on, but the second "p" is a cache miss.

With only 4 blocks in the cache, replacement is now necessary to improve performance.


Inside the Cache() class, three methods implementing replacement policies have been added:

  • .random_policy() returns a randomly selected entry index.
  • .fifo_policy() returns the next-in-line entry index, using the self.index variable.
  • .replace_entry() takes an address and data as parameters and uses one of the above methods to update self.data.
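The method names above come from the exercise; how they fit together inside the class is not shown, so the sketch below is a guess at one plausible arrangement. The method bodies, the 4-entry size, and the `(address, data)` tuple layout of self.data are assumptions.

```python
import random

class Cache:
    """Sketch of the three replacement methods; bodies are illustrative guesses."""

    def __init__(self, size=4):
        self.size = size
        self.data = [None] * size  # cache entries as (address, data) pairs
        self.index = 0             # next-in-line pointer used by the FIFO policy

    def random_policy(self):
        # Return a randomly selected entry index.
        return random.randrange(self.size)

    def fifo_policy(self):
        # Return the next-in-line entry index, then advance (with wrap-around).
        i = self.index
        self.index = (self.index + 1) % self.size
        return i

    def replace_entry(self, address, data):
        # Use one of the policy methods to pick which entry to overwrite.
        i = self.random_policy()
        self.data[i] = (address, data)
```

Swapping the call inside .replace_entry() from self.random_policy() to self.fifo_policy() is all it takes to change the policy, which is the change a later step of the exercise asks for.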

To implement a replacement policy in the Cache() class, make the following change inside the else clause of the .read() method:

  • Replace .add_entry() with .replace_entry(). Keep address and data as the parameters.

When you run the code, the first "p" accessed is a cache miss and the second is a cache hit.


Because the .random_policy() method selects which entry to replace, the hits, misses, and execution time will vary across repeated runs of the instructions. To get a more consistent result, try out the FIFO policy.

Inside the .replace_entry() method:

  • Replace self.random_policy() with self.fifo_policy()

The cache hits, misses, and execution time will now be more consistent.
