Slide 24 of 51

Are the little yellow circles switches? If so, why is this O(1) latency rather than O(N)?


I believe in this lecture switches are denoted by yellow squares. This diagram just shows that each node is directly connected to every other node, so O(1) latency.


The yellow circles are, in fact, switches. The delay from any input to any output is bounded, and hence constant; therefore the crossbar is said to have O(1) delay.

source: Advanced Computer Architecture and Parallel Processing, 2.5.1 Dynamic Networks.
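To make the "bounded delay" point concrete, here's a toy Python sketch (my own model, not from the book): the crossbar is just a grid of crosspoints, and connecting any source to any destination means closing the single crosspoint where their two wires meet, so every message takes exactly one hop regardless of N.

```python
# Hypothetical sketch: an N x N crossbar modeled as a grid of crosspoint
# switches. A connection closes one crosspoint; hop count is 1 for any N.

class Crossbar:
    def __init__(self, n):
        self.n = n
        # closed[(src, dst)] is True when the crosspoint where src's row
        # wire meets dst's column wire is connecting the two.
        self.closed = {}

    def connect(self, src, dst):
        self.closed[(src, dst)] = True

    def hops(self, src, dst):
        # A message traverses exactly one closed crosspoint: O(1) hops.
        return 1 if self.closed.get((src, dst)) else None

xbar = Crossbar(8)
xbar.connect(0, 7)
print(xbar.hops(0, 7))  # 1, whether n is 8 or 8000
```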


It's interesting to note that all complete graphs are non-blocking, and a crossbar is essentially just a complete graph. The most intuitive reasoning is that every pair of nodes has its own dedicated link, so no two connections ever have to share a path.
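A quick way to convince yourself of the non-blocking property (a sketch of my own, not from the lecture): connection (src, dst) uses row wire `src` and column wire `dst`, so for any permutation of destinations every connection gets a distinct row and a distinct column, and no link is shared.

```python
# Sanity check (hypothetical model): in a crossbar, connection (src, dst)
# occupies row wire src and column wire dst. For a permutation of
# destinations, all rows and all columns are distinct -> no shared links.
import random

n = 6
dsts = list(range(n))
random.shuffle(dsts)  # one src -> dst permutation

rows = [src for src in range(n)]          # row wire used by each connection
cols = [dsts[src] for src in range(n)]    # column wire used by each connection
assert len(set(rows)) == n and len(set(cols)) == n  # all links distinct
print("no shared links for this permutation")
```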


Is there a reason each node has a switch that goes to itself?


@thomasts The messages don't stop at the switches, so the latency is still O(1). Technically it takes slightly longer for a message to go from 0 to 7 than from 0 to 1 because it travels a longer wire, but this is negligible because the distances are so small.


@FarmerScrub My guess would be that a node can choose to receive packets from any one of the up to four wires that go into its switch, so you need a switch to handle that.


@FarmerScrub Do you mean that each horizontal and vertical link is connected, and each switch can choose whether to connect the horizontal and vertical links it is in charge of? For example, if node 1 wants to connect to node 7, only the switch at the crossing of 1's and 7's wires needs to work.


Can there still be issues of contention in a crossbar interconnect? For example, say nodes 1 - 7 all want to send a message to node 0. Would the first vertical wire be a bottleneck? If so, what's a way to get around this? If not, what's the reason?


@MaxFlowMinCut, I think there won't be contention: although nodes 1-7 all seem to pass through the same switch to reach node 0, they are using separate links, just like what Prof. Kayvon said about the ring in the comments on the previous slides. And a switch can support multiple links.


@FarmerScrub I am still a bit confused about why the latency is O(1). If I understand correctly, although the message travels past O(n) switches, the network is non-blocking, so we still treat the latency as O(1)? So we take whether it's non-blocking/blocking into account when it comes to latency?


It is O(1) because the length of the path doesn't grow with the number of nodes.


My understanding: each yellow dot on the grid represents an end to end connection between source node and destination node. So a message from 0->7 only needs one hop, not eight.


@habini @MaxFlowMinCut @Khryl The way I like to think of a crossbar is as essentially a more sophisticated bus. For each endpoint in the network, there is a single link, shared among all of the nodes in the network for communicating with it. Thus, there must be arbitration because this link is shared among all of the nodes in the network. For example, if 0 and 1 wished to communicate with 7, there would be contention.

There are several different ways the shared links could be implemented. In the above picture, it is implemented as a switched link. At sending time, the network decides which of the 0->7 and 1->7 packets gets priority, and the switches are programmed to create a path between the nodes, so the latency is still O(1). Alternatively, these shared links could be implemented as buses, which would still require arbitration.
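Here's a toy sketch of what that arbitration could look like (hypothetical, assuming a simple fixed-priority policy; real hardware might use round-robin or something fairer): when several sources request the same output link, one wins the cycle and the rest stall and retry.

```python
# Hypothetical fixed-priority arbiter for a crossbar's output links:
# each output link can be granted to at most one source per cycle.

def arbitrate(requests):
    """requests: list of (src, dst) pairs. Returns (granted, stalled)."""
    granted, stalled, taken = [], [], set()
    for src, dst in sorted(requests):  # fixed priority: lowest src wins
        if dst in taken:
            stalled.append((src, dst))  # output link busy; retry next cycle
        else:
            taken.add(dst)
            granted.append((src, dst))
    return granted, stalled

# Nodes 0 and 1 both want to reach node 7; node 2 wants node 3.
granted, stalled = arbitrate([(0, 7), (1, 7), (2, 3)])
print(granted)  # [(0, 7), (2, 3)]
print(stalled)  # [(1, 7)] -- 1->7 must wait a cycle
```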


I think every crossbar has multiple inputs/outputs. Every connection has its own path, which is why the crossbar is non-blocking.

As for O(1) latency, my guess is that the delay is bounded by the length of the individual path, instead of growing with the number of nodes.


@Khryl @TanXiaoFengSheng Based on your explanations, can we think of the crossbar interconnect as a complete graph?


@haibinl: Were you able to figure out the answer to "So we take whether it's non-blocking/blocking into account when it comes to latency?"?


@xiaoguaz that's true according to @totofufu