huehue

What's the difference between bandwidth and throughput? They both appear to be rates of data transfer.

Calloc

You are correct that they are both rates, but bandwidth is the theoretical maximum, while throughput is the rate actually achieved. Throughput can be, and usually is, smaller than the bandwidth. For instance, with the highway analogy, think about limiting factors like roadblocks, accidents, and traffic jams; bandwidth doesn't account for these.

Khryl

@huehue, I think bandwidth is a measure of the capacity to transfer data, while throughput is the rate at which data is actually transferred. Throughput can be zero if no data is being transferred, but the bandwidth is always there.

arcticx

@huehue Bandwidth is a property of the transmission medium, while throughput is the actual speed at which the data is transferred (limited by bandwidth).
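
To make the distinction concrete, here is a minimal Python sketch that measures achieved copy throughput and compares it against a quoted peak bandwidth. The buffer size and the 25.6 GB/s peak figure are just illustrative assumptions, not numbers from the slide:

import time

N = 256 * 1024 * 1024                 # 256 MiB buffer (illustrative size)
src = bytearray(N)

start = time.perf_counter()
dst = bytes(src)                      # one big memory copy: reads N bytes, writes N bytes
elapsed = time.perf_counter() - start

bytes_moved = 2 * N                   # the copy touches the data twice: read + write
throughput_gbs = bytes_moved / elapsed / 1e9

PEAK_BANDWIDTH_GBS = 25.6             # theoretical peak from a hardware spec (example value)
print(f"achieved ~{throughput_gbs:.1f} GB/s out of a {PEAK_BANDWIDTH_GBS} GB/s peak")

The measured number typically lands well below the peak, for the same reasons as the highway analogy above.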

Detailed discussion (in comments)

xiaoguaz

FYI, a CPU is designed to focus on latency, while a GPU is designed to focus on bandwidth.

totofufu

@xiaoguaz I didn't know that; could you expand on that a little more? How exactly does a CPU focus on latency and a GPU on bandwidth? Do you mean that they're specialized to be particularly good at those respective goals? If so, how is that achieved?

Updog

@totofufu, To elaborate a bit, it's mainly a matter of memory bus width and bus clock rate. Here are some specs from a recent CPU and GPU, a Core i7-4790K and a GTX 980. The CPU has 2 memory channels, each 64 bits wide, supporting DDR3 memory at up to 1600 MT/s. Multiply those numbers out and you get the 25.6 GB/s max memory bandwidth that Intel lists. The GPU has a 256-bit bus and GDDR5 memory running at a 7 GT/s effective transfer rate, which gives 224 GB/s max bandwidth. As you can see, GDDR5 runs at a much higher transfer rate than DDR3, though I don't know the details of how exactly it manages to do that.
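
Spelling out that arithmetic as a quick Python sketch (the helper name peak_bandwidth_gbs is just for illustration): peak bandwidth is channels x bus width in bytes x transfers per second.

def peak_bandwidth_gbs(bus_bits, transfers_per_sec, channels=1):
    # Peak bandwidth in GB/s = channels * (bus width in bytes) * transfers per second / 1e9
    return channels * (bus_bits / 8) * transfers_per_sec / 1e9

# CPU: 2 channels, each 64 bits wide, DDR3 at 1600 MT/s
print(peak_bandwidth_gbs(64, 1600e6, channels=2))   # 25.6 GB/s

# GPU: 256-bit bus, GDDR5 at 7 GT/s effective transfer rate
print(peak_bandwidth_gbs(256, 7e9))                 # 224.0 GB/s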