regi

relevant xkcd on increasing bandwidth (at the cost of latency).

kayvonf

Question: In class @BryceToTheCore mentioned that when he thinks of latency, he thinks of the mail. What are other good real-world operations that make you think of latency? What about bandwidth?

For example, other common metaphors for latency and bandwidth are:

  • water pipes
  • highways

Would anyone care to expand on these metaphors here?

bwf

We can think of a highway's speed limit and its number of lanes in terms of latency and bandwidth. If cars are data, the speed limit determines how quickly each car reaches its destination (assuming there is no traffic). The number of lanes is like bandwidth because it determines how many cars can be traveling at the same time.

For water pipes, we can use the water pressure and the thickness of the pipe as the metaphor for latency and bandwidth. The water pressure determines how fast water travels from point A to point B, whereas the thickness of the pipe determines how much water can be flowing through it at any given time.
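One way to make both metaphors concrete: the time to move a chunk of data is roughly latency + bytes / bandwidth, so small transfers are dominated by the latency term and large transfers by the bandwidth term. A minimal sketch, with made-up numbers chosen only for illustration:

```c++
#include <cstdio>
#include <initializer_list>

int main() {
    // Made-up link parameters ("speed limit" and "number of lanes").
    const double latency_s     = 100e-9;  // time for the first byte to arrive
    const double bandwidth_bps = 25e9;    // bytes per second the link can sustain

    for (double bytes : {64.0, 64e3, 64e6}) {
        double transfer_s = latency_s + bytes / bandwidth_bps;
        printf("%10.0f bytes -> %.3g s (effective bandwidth %.3g GB/s)\n",
               bytes, transfer_s, bytes / transfer_s / 1e9);
    }
    return 0;
}
```

For the 64-byte transfer almost all of the time is latency, while the 64 MB transfer runs at nearly the full 25 GB/s.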

jcarchi

Another way to visualize latency is fiber-optic cables. Since they use light, the latency is the time it takes light to travel through the cable. An interesting application of decreasing network latency: http://www.bloomberg.com/bw/articles/2012-04-23/high-speed-trading-my-laser-is-faster-than-your-laser
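To put a rough number on it: light in a fiber travels at about two thirds of its speed in a vacuum, roughly 200,000 km/s, so a 1,000 km cable contributes about 1,000 km / 200,000 km/s = 5 ms of one-way latency from propagation alone, no matter how much bandwidth the cable has. That propagation floor is what the traders in the article are trying to shave with straighter, through-the-air links.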

oulgen

Another way to think about latency is online video games. Most games show the player how much latency they have, generally displayed as ping. I believe this terminology comes from how long it takes the client to ping the server and for the server to respond. It is generally assumed that 0-20 ms of latency is acceptable for comfortable gameplay.

doodooloo

I noticed that in both assignment 1 and assignment 2, all of the bandwidth values are computed as (total bytes) / (total time taken). I am not really comfortable with using the total time taken in this calculation, because the time spent on computation seems irrelevant when we are analyzing memory bandwidth. Or maybe I am wrong, and it is relevant?

Another question: I realized the definition of bandwidth in this lecture is slightly different from our previous definition, so is bandwidth only a concept associated with memory/communication, or a concept we can talk about for all kinds of operations?

kayvonf

The code in the homeworks computed bandwidth by dividing the total number of bytes transferred by the total time of the computation. Of course, there's an assumption that data was being transferred during this entire period, but in those cases it's a reasonable assumption. Really, what we were measuring was the overall average bandwidth observed over that period of time.
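For anyone who wants to see how such a number is produced, here is a minimal sketch in the same spirit (not the actual assignment code): time a large copy and divide the total bytes moved by the elapsed time, which is the average bandwidth observed over that interval.

```c++
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    // Use buffers much larger than the last-level cache so we measure memory traffic.
    const std::size_t N = 1 << 27;                  // 128M floats, 512 MB per buffer
    std::vector<float> src(N, 1.0f), dst(N);

    auto start = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < N; i++)
        dst[i] = src[i];                            // N floats read + N floats written
    auto end = std::chrono::steady_clock::now();

    double seconds = std::chrono::duration<double>(end - start).count();
    double bytes   = 2.0 * N * sizeof(float);       // count both the reads and the writes
    printf("Moved %.1f GB in %.3f s -> average bandwidth %.2f GB/s (dst[0] = %.1f)\n",
           bytes / 1e9, seconds, bytes / seconds / 1e9, dst[0]);
    return 0;
}
```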

mangocourage

Is there a relationship between bandwidth and latency, then? Or is it better to think of bandwidth as how fast you can pull data from a pipe once all of said data has "arrived" at the other end of the pipe?

kayvonf

@mangocourage: Both of your comments come down to startup latency. Startup latency will limit overall effective bandwidth if messages are not pipelined. But with proper overlap, the effective bandwidth of a link will be the link bandwidth: the rate at which data comes out the end of the link (as you say).
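To put rough numbers on that (made-up link parameters, purely for illustration): if each message waits for the previous one to complete, the effective bandwidth is message_size / (startup_latency + message_size / link_bandwidth), which only approaches the link bandwidth for large messages; with messages pipelined so the startup cost overlaps data transfer, the sustained rate is the link bandwidth itself.

```c++
#include <cstdio>
#include <initializer_list>

int main() {
    // Made-up link parameters, just to show the two regimes.
    const double startup_s = 1e-6;    // per-message startup latency
    const double link_bps  = 10e9;    // link bandwidth: 10 GB/s

    for (double msg_bytes : {1e3, 1e5, 1e7}) {
        // Un-pipelined: each message pays the full startup latency before its bytes flow.
        double effective_bps = msg_bytes / (startup_s + msg_bytes / link_bps);
        printf("%8.0f-byte messages: un-pipelined %5.2f GB/s, pipelined ~%.2f GB/s\n",
               msg_bytes, effective_bps / 1e9, link_bps / 1e9);
    }
    return 0;
}
```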