Is throughput always equal to 1/latency, and are there cases where this relationship is not very useful?
For example, are there cases where a system has high throughput (it can transfer a lot of data at once) but is not very fast at transferring any individual piece of data? In that case, if your application only needs a small amount of data, the high throughput would not help you much, because you are more concerned with latency.
@msfernan I believe the formula only applies when just one packet / car can travel through the medium at a time, as in the case above.
That is a very limited case; in general, the relationship between the two values is better captured by the highway analogy.
@msfernan Remember that throughput is a rate of jobs or tasks per unit time, while latency is an amount of time. So the formula is actually:
(system throughput) = (# of tasks system runs at a time) / (latency of each task)
As yimmyz points out, the numerator is 1 here only because our system handles one task at a time: the single car allowed on the highway. It's also important to note that the tasks in this example all have the same latency. If tasks had varying latencies (in this example, if the cars were not all driving at exactly the same speed), the denominator would need to be the average latency per task over the period for which we want to know the throughput.
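The formula above can be sketched in a few lines of Python. This is just an illustrative calculation (the function name and inputs are made up for this example), not real measurement code:

```python
def throughput(tasks_in_flight, latencies):
    """Tasks completed per unit time, given a fixed number of
    concurrent tasks and a list of per-task latencies
    (all in the same time unit)."""
    avg_latency = sum(latencies) / len(latencies)
    return tasks_in_flight / avg_latency

# One car at a time, every trip taking 2 hours:
# throughput = 1 / 2 = 0.5 cars per hour.
print(throughput(1, [2.0, 2.0, 2.0]))  # 0.5

# Varying trip times: the denominator becomes the average latency.
# Average of [1, 2, 3] is 2, so throughput is still 0.5.
print(throughput(1, [1.0, 2.0, 3.0]))  # 0.5

# A wider highway (4 cars at once, same 2-hour trips) quadruples
# throughput without changing the latency of any single trip.
print(throughput(4, [2.0, 2.0, 2.0]))  # 2.0
```

The last call is the point of the original question: concurrency raises throughput, but each individual task still takes just as long.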