Slide 37 of 46
mperron

Is it likely that the convergence will happen where the arrows indicate, or is there evidence to suggest that some kind of happy medium, with moderate computational and data intensity, will emerge instead? The graph suggests that a system with both high data intensity and high computational intensity could be built. Is there some fundamental trade-off between the two?

PID_1

@mperron I think the trade-off is how much everything is tuned for the specific system. In the supercomputing world, everything is tuned to run exceedingly well on one very specific hardware configuration (removing runtime overhead, even to the point of not needing a virtual memory system). In the big-data world, inputs are heterogeneous and often arrive from streaming/online sources, so there can't be the same level of baked-in tuning that supercomputing enjoys.

418_touhenying

Either machines that do well on both axes will appear, or there will be trade-offs, in which case it probably won't amount to a true "convergence".

ojd

@mperron: Most likely it is a resource limitation. Computational and data intensity are somewhat orthogonal axes, so optimizing for both costs far more and adds far more complexity than optimizing for one alone. Economics dictates that companies will tend to build the cheapest system that meets their needs, so until now there simply wasn't a need for a system that covered the extra axis. Maybe these new needs will motivate change in this area.

bdebebe

@mperron It's very reliant on money, as @ojd mentioned. It's a cynical way to look at it, but that type of research and development will only happen if there is a push in the right direction by people with the money to fund it. That said, it seems that lately the push is happening more often.