You could make the queues bigger to prevent this deadlock. You could also have the system check that there is space in the queue before adding a request.
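A minimal sketch of the check-before-enqueue idea, assuming a software model of the hardware queues (the `try_send` helper and queue names are hypothetical, not from the slide):

```python
from queue import Queue, Full

# Hypothetical model: a tiny bounded queue, as on the slide.
l1_to_l2 = Queue(maxsize=1)

def try_send(q, msg):
    """Return True if msg was enqueued, False if the queue is full."""
    try:
        q.put_nowait(msg)   # non-blocking: never stalls the controller
        return True
    except Full:
        return False        # caller can go drain other work instead

assert try_send(l1_to_l2, "req A")       # succeeds: queue had space
assert not try_send(l1_to_l2, "req B")   # fails: size-1 queue is full
```

The point is that the controller never blocks on a full queue, so it stays free to service incoming messages.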
cluo1
We could add logic in the L1 cache that checks the L1->L2 queue. If that queue is full, the L1 cache serves a request from the L2->L1 queue instead of blocking while waiting on the L1->L2 queue. This could prevent deadlock.
Bye
@cluo1 I think the issue is that a request taken from the L2->L1 queue will generate a response destined for the L1->L2 queue, which is already full. This problem motivates the separate request and response queues in the later slides.
jedi
To eliminate the circular dependency, you could:
agree on an ordering scheme, e.g. always grab L1L2Q before L2L1Q; thus the L2 cache must reserve a slot for its response to L1 before accepting an operation. However, this ordering scheme does not scale well as the number of queues increases
abort a message when a deadlock is detected
(see next slide)
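The reservation idea above can be sketched as a bounded queue that hands out slots before any work is done; this is a hypothetical single-threaded model (the `ReservableQueue` class is my own name, not from the slides):

```python
from queue import Queue

class ReservableQueue:
    """Bounded queue where a sender reserves a slot before doing work."""
    def __init__(self, size):
        self.q = Queue(maxsize=size)
        self.reserved = 0

    def reserve(self):
        # Succeed only if a slot is free after counting reservations.
        if self.q.qsize() + self.reserved < self.q.maxsize:
            self.reserved += 1
            return True
        return False

    def put_reserved(self, msg):
        # Consume a reservation; this put can never block.
        self.reserved -= 1
        self.q.put_nowait(msg)

l2_to_l1 = ReservableQueue(1)
# L2 dequeues a request only if it can first reserve space for the response,
# so it is never caught mid-operation with nowhere to put its reply.
if l2_to_l1.reserve():
    # ... process the request from L1 ...
    l2_to_l1.put_reserved("resp A")
```

Because L2 commits to an operation only after the response slot is guaranteed, the circular wait cannot form.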
chenboy
So the problem is that both the L1 cache and the L2 cache are processing a request, and since each queue has size 1, neither can put its response back on the queue. The resource here is the work queue: L1 and L2 each hold a queue slot that the other cache needs, so a deadlock forms.
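The circular wait described above can be sketched with two size-1 queues (a hypothetical model, assuming one queue per direction as on the slide):

```python
from queue import Queue

# Each size-1 queue already holds an outgoing request.
l1_to_l2 = Queue(maxsize=1)
l2_to_l1 = Queue(maxsize=1)
l1_to_l2.put("request from L1")   # L1's own request fills L1->L2
l2_to_l1.put("request from L2")   # L2's own request fills L2->L1

# L1 has processed L2's request and owes a response on L1->L2;
# L2 owes a response on L2->L1. Both outgoing queues are full, and
# neither drains until the other side makes progress: circular wait.
assert l1_to_l2.full() and l2_to_l1.full()
```

A blocking `put` on either queue at this point would stall that cache forever, which is exactly the deadlock on the slide.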