williamx

Is it possible to decrease latency without improving bandwidth?

whitelez

@williamx I think the only way to truly reduce latency is to improve the physical properties of the hardware, since latency is ultimately bound by physical limits (mainly the speed of light over the distance a signal must travel). However, it is possible to reduce the observed latency in practice, for example by adding a cache to the communication path so that most requests never have to make the full trip.
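Here is a minimal sketch of that idea in C++, assuming a hypothetical slow_fetch() that stands in for a long-latency access: a small cache in front of it makes repeated requests complete almost instantly, even though the latency of slow_fetch() itself is unchanged.

```cpp
#include <chrono>
#include <iostream>
#include <thread>
#include <unordered_map>

// Hypothetical long-latency access (stands in for a remote memory or network request).
int slow_fetch(int key) {
    std::this_thread::sleep_for(std::chrono::milliseconds(100));  // ~100 ms latency
    return key * 2;
}

// Cache in front of slow_fetch: repeated requests for the same key are served locally.
std::unordered_map<int, int> cache;

int cached_fetch(int key) {
    auto it = cache.find(key);
    if (it != cache.end()) return it->second;   // hit: observed latency is tiny
    int value = slow_fetch(key);                // miss: pay the full latency once
    cache[key] = value;
    return value;
}

int main() {
    auto time_ms = [](auto f) {
        auto t0 = std::chrono::steady_clock::now();
        f();
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count();
    };
    std::cout << "first access:  " << time_ms([] { cached_fetch(42); }) << " ms\n";
    std::cout << "second access: " << time_ms([] { cached_fetch(42); }) << " ms\n";
}
```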

khans

So bandwidth and latency are not related, but together give us throughput? Like throughput = bandwidth/latency?

shpeefps

I can also imagine several programs whose latency varies over time. So I guess we would have to define it as throughput = (bandwidth at that time) / (latency at that time).

chenboy

Correct me if I'm wrong. I think both bandwidth and throughput measure the rate at which data is processed, so we can't say throughput = bandwidth / latency. Keep in mind that in lecture 2 we were told that throughput can be limited by memory bus bandwidth: the rate at which a computation fetches and processes data can be capped by the maximum transfer rate of the memory bus, in other words by the bus's bandwidth.
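A small sketch of that situation (my own example, not from the lecture): an element-wise kernel that does only one multiply per 8 bytes moved, so its achieved throughput is set by how fast the memory system can stream the arrays rather than by how fast the ALUs can multiply.

```cpp
#include <chrono>
#include <iostream>
#include <vector>

int main() {
    const std::size_t N = 1 << 26;            // 64M floats, ~256 MB per array
    std::vector<float> a(N, 1.0f), b(N);

    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < N; ++i)
        b[i] = 2.0f * a[i];                   // 1 multiply per 8 bytes moved: memory bound
    auto t1 = std::chrono::steady_clock::now();

    double sec = std::chrono::duration<double>(t1 - t0).count();
    double bytes = 2.0 * N * sizeof(float);   // read a[i], write b[i]
    std::cout << "achieved throughput: " << bytes / sec / 1e9 << " GB/s\n"
              << "(compare against the peak memory bandwidth of your machine)\n";
}
```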

harlenVII

Latency is hard to optimize directly, but there are several ways to "hide" or avoid it, like giving each core more execution contexts so it can switch to other work while one thread waits on memory, or using caches so that most accesses never go to main memory at all.
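A rough sketch of latency hiding through concurrency, using a hypothetical slow_op() standing in for a long memory or I/O stall: issuing several operations at once lets the waits overlap, so the total time is close to one latency rather than the sum of all of them.

```cpp
#include <chrono>
#include <future>
#include <iostream>
#include <thread>
#include <vector>

// Hypothetical operation with ~100 ms of latency (e.g., a long stall).
int slow_op(int i) {
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    return i;
}

int main() {
    const int N = 8;
    auto t0 = std::chrono::steady_clock::now();

    // Issue all N operations concurrently; their latencies overlap.
    std::vector<std::future<int>> inflight;
    for (int i = 0; i < N; ++i)
        inflight.push_back(std::async(std::launch::async, slow_op, i));
    for (auto& f : inflight) f.get();

    auto t1 = std::chrono::steady_clock::now();
    std::cout << "8 ops, each with 100 ms of latency, finished in "
              << std::chrono::duration<double, std::milli>(t1 - t0).count()
              << " ms total\n";   // close to 100 ms, not 800 ms
}
```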

SR_94

@chenboy I think you are right. Bandwidth is the maximum amount of work that can be done in a given amount of time while throughput is the actual amount of work that is done in a given amount of time. So throughput != bandwidth/latency.

fxffx

To calculate latency, we only need to consider a single operation and measure how much time it takes to complete. To calculate throughput, we need to consider a group of the same operations and count how many of them complete in a fixed amount of time.
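To make that concrete, here is a made-up example (numbers are illustrative, not from the lecture). Suppose the latency of one memory load is 100 ns and the peak bus bandwidth is 25 GB/s. If a program actually moves 16 GB in one second, its throughput is 16 GB/s, which is at most the bandwidth but need not equal it. Meanwhile bandwidth/latency would be 25 GB/s / 100 ns ≈ 2.5 * 10^17 bytes/s^2, which does not even have the units of a rate of work, so the proposed formula can't be right.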