Slide 31 of 35
unihorn

In more detail, a buffer deadlock occurs when the L1 and L2 caches are each waiting on the other's full buffer. L1 tries to send a request to L2, but the L1->L2 queue is full, so it stalls waiting for the queue to drain. Meanwhile, L2 is in the same situation: it is trying to send a message to L1, but the L2->L1 queue is also full. Each cache is waiting for the other to make progress before it can free buffer space, so neither ever does, and the system deadlocks.
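As a rough illustration of the circular wait described above, here is a minimal sketch (the queue capacities, message contents, and `step` helper are all hypothetical, not from the lecture): each cache can only service an incoming message if there is room to enqueue its reply on the outgoing queue, and once both queues are full, neither side can take a step.

```python
from collections import deque

# Hypothetical model: two caches connected by one bounded queue in each
# direction. A cache services an incoming message only if it has room to
# enqueue the reply it generates -- the condition that enables deadlock.
CAPACITY = 2

l1_to_l2 = deque()  # messages traveling L1 -> L2
l2_to_l1 = deque()  # messages traveling L2 -> L1

def step(incoming, outgoing):
    """Try to make progress at one cache: consume one incoming message and
    enqueue the reply it generates. Returns False (the cache stalls) if the
    outgoing queue is full, since the reply cannot be dropped."""
    if incoming and len(outgoing) < CAPACITY:
        incoming.popleft()
        outgoing.append("reply")
        return True
    return False

# Fill both queues: each cache now has pending input but no room for output.
for _ in range(CAPACITY):
    l1_to_l2.append("req")
    l2_to_l1.append("req")

progress = step(l2_to_l1, l1_to_l2) or step(l1_to_l2, l2_to_l1)
print("deadlock" if not progress else "progress")  # -> deadlock
```

Neither call to `step` succeeds because each cache's outgoing queue is the other cache's full incoming queue, which is exactly the cycle of waits that defines the deadlock.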

stephyeung

A simple, conservative way to prevent buffer deadlock is to bound the number of outstanding requests a buffer will hold. If there is no space for an incoming request, the receiver sends back a negative acknowledgement (NACK) instead of stalling, and the sender retries later. Another way to avoid deadlock is to allocate two separate queues, one for requests and one for responses, which is explained further on the next slide.
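The NACK approach might be sketched as follows (the class name, buffer size, and message labels are illustrative assumptions, not part of the lecture): the receiver never blocks on a full buffer; it rejects the request, and the sender is responsible for retrying.

```python
from collections import deque

# Hypothetical sketch of NACK-based flow control: the receiver's buffer has
# a small fixed bound; a request arriving at a full buffer is rejected with
# a negative acknowledgement rather than stalling the receiver.
CAPACITY = 2

class Receiver:
    def __init__(self):
        self.buffer = deque()

    def offer(self, msg):
        if len(self.buffer) >= CAPACITY:
            return "NACK"          # no space: reject; sender must retry
        self.buffer.append(msg)
        return "ACK"

    def drain_one(self):
        if self.buffer:
            self.buffer.popleft()  # service one buffered request

rx = Receiver()
print([rx.offer(f"req{i}") for i in range(3)])  # ['ACK', 'ACK', 'NACK']

rx.drain_one()                  # receiver makes progress, freeing a slot
print(rx.offer("req2-retry"))   # ACK on retry
```

Because the receiver always answers immediately (with either an ACK or a NACK), it never holds a message it cannot forward, so the circular wait from the previous comment cannot form; the cost is the extra NACK/retry traffic.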

edwardzh

Note that this problem only occurs when the cache closer to the processor (in this case, the L1 cache) is a write-back cache: only then can L2 need to request data from L1 (to fetch a dirty line), producing traffic in both directions. With a write-through L1, L2 always holds up-to-date data and never has to wait on L1, so the cycle cannot form.