Slide 20 of 70
jhibshma

I think I finally understand how latency does and doesn't relate to throughput. If you decrease latency by "shortening the road," cars will appear at Pittsburgh at the same rate (throughput) as before. On the other hand, if you decrease latency by making the car faster, you will also increase throughput.

Now, making the road twice as long and doubling the speed of the cars would be the same as simply adding a second lane.

In a real, physical memory transfer system, is there a parallel to all three operations (change road length, change speed limit, add/remove lanes)?
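To make the three knobs concrete, here is a minimal sketch of the road analogy in Python (the road length, speed, and spacing numbers are made up purely for illustration):

```python
# Road analogy: latency is the time one car spends on the road,
# throughput is the rate at which cars arrive at the far end.

def road_stats(length_km, speed_kmh, spacing_km, lanes=1):
    latency_h = length_km / speed_kmh                        # time for one car to cross
    throughput_cars_per_h = lanes * speed_kmh / spacing_km   # arrival rate at the end
    return latency_h, throughput_cars_per_h

baseline   = road_stats(length_km=300, speed_kmh=100, spacing_km=1)
short_road = road_stats(length_km=150, speed_kmh=100, spacing_km=1)           # latency halves, throughput unchanged
fast_cars  = road_stats(length_km=300, speed_kmh=200, spacing_km=1)           # latency halves, throughput doubles
two_lanes  = road_stats(length_km=300, speed_kmh=100, spacing_km=1, lanes=2)  # latency unchanged, throughput doubles

print(baseline, short_road, fast_cars, two_lanes)
```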

anonymous

@jhibshma I believe there are parallels for all three operations in a physical memory transfer system. Changing the length of the wire can affect the transfer, depending on the type of communication; for example, a USB cable has a maximum length and does not work past it. I'm not sure exactly about changing the speed limit, but I assume the hardware could be built to signal at different rates. Adding/removing lanes is similar to changing the overhead of sending the data: by shrinking the headers/footers attached to each piece of data, you can send more useful data across even though the raw bits/sec being transferred stays the same.
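For that header/footer point, a quick sketch (with made-up frame sizes) of how shrinking per-packet overhead raises the useful data rate even when the raw bit rate is fixed:

```python
# Useful ("goodput") throughput at a fixed line rate: smaller headers/footers
# mean more of each frame is payload, so more data gets across per second.

def goodput_gbps(line_rate_gbps, payload_bytes, overhead_bytes):
    return line_rate_gbps * payload_bytes / (payload_bytes + overhead_bytes)

print(goodput_gbps(10, payload_bytes=1500, overhead_bytes=78))  # ~9.5 Gb/s of payload
print(goodput_gbps(10, payload_bytes=1500, overhead_bytes=18))  # ~9.9 Gb/s of payload
```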

jaguar

Another way of looking at the difference between latency and throughput is that if you put a 3.6 terabyte hard drive on an airplane and flew it between Washington, DC and Pittsburgh, the latency would be about 1 hour, i.e. the length of the flight. However, the throughput would be 1 gigabyte per second, i.e. the amount of data reaching Pittsburgh on average each second. Hence, you could increase throughput by putting more hard drives on the plane. You could also increase throughput by decreasing latency, i.e. flying the plane faster.
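A quick arithmetic check of those airplane numbers (assuming 1 TB = 1000 GB and a one-hour flight):

```python
capacity_gb = 3.6 * 1000            # 3.6 TB of disk on the plane
flight_time_s = 60 * 60             # roughly a one-hour flight
print(capacity_gb / flight_time_s)  # -> 1.0 GB/s average throughput, with 1 hour latency
```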

aeu

Maybe not entirely related to memory bandwidth but the whole 'throughput & cars' analogy made me think of Andrew Tanenbaum's famous quote "Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway". This XKCD What If question takes a scientific look at the throughput of various methods of data transmission.

maxdecmeridius

Is having cars spaced by 1km analogous to pipelining? So would decreasing the space between cars be the same as increasing the hardware's capability to pipeline data computation?

colorblue

@maxdecmeridius

Yes. In this example, think of the road as a pipeline of segments where only one car (instruction) can occupy a segment at a time. Decreasing the distance between cars is similar to decreasing the cycle time of the pipeline and therefore increasing the number of stages. Interestingly enough, deepening the pipeline this way will most likely increase the latency of a single instruction, since each stage adds some overhead, even though throughput goes up.
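A minimal sketch of that trade-off (all numbers made up): splitting the same work across more stages shortens the cycle time and raises throughput, but per-stage overhead makes the end-to-end latency of one instruction grow.

```python
# Deeper pipeline = shorter cycle, more stages. Throughput is one instruction
# per cycle; latency is the number of stages times the cycle time.

def pipeline_stats(total_work_ns, stages, stage_overhead_ns=0.5):
    cycle_ns = total_work_ns / stages + stage_overhead_ns
    latency_ns = stages * cycle_ns
    throughput_ghz = 1.0 / cycle_ns
    return latency_ns, throughput_ghz

print(pipeline_stats(total_work_ns=10, stages=5))   # (12.5 ns latency, 0.4 GHz throughput)
print(pipeline_stats(total_work_ns=10, stages=20))  # (20.0 ns latency, 1.0 GHz throughput)
```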

doodooloo

Is throughput the same as bandwidth? Based on what I read online, that seems to be the case. If so, I would think of throughput as a toll station that only lets cars through at a certain rate, so letting the cars drive faster (as shown in this slide) should not affect the throughput at all. Unless my understanding of throughput is incorrect, I guess this slide should be amended.
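In common usage, bandwidth is the peak rate a link can sustain, while throughput is the rate you actually achieve. One way to see the toll-station point: if there really is a rate-limiting stage somewhere, the delivered throughput is set by the slowest stage, so driving the cars faster only helps until that bottleneck takes over (illustrative numbers below):

```python
# End-to-end throughput is capped by the bottleneck stage.

def delivered_rate(stage_rates_cars_per_min):
    return min(stage_rates_cars_per_min)

print(delivered_rate([10, 25]))  # road feeds 10/min, toll passes 25/min -> 10/min delivered
print(delivered_rate([40, 25]))  # faster cars feed 40/min, toll still 25/min -> 25/min delivered
```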