Should the implications of this slide have any effect on mobile phone app developers? The idea of recomputing values to save energy may be more of a concern for compilers than for programmers, but writing code that exploits locality should still matter. Do app developers have incentives to care about energy costs?
More cache hits mean less data movement. The Jigsaw paper tries to tackle the data-movement problem by proposing novel resource-management algorithms that, at runtime, both maximize cache utilization and place data close to where it is used.
@Josephus I would think it's actually the app developer who is more responsible for managing these energy constraints than the compiler. A good compiler may perform steps like common subexpression elimination (e.g., for (x+y) * (x+y) it would compute the sum only once and use the stored value twice), but that usually happens on a very local scale, such that the data is still in registers at the time of reuse. For anything large enough to go all the way to DRAM, the compiler won't be auto-memoizing, so it's the programmer's responsibility to explicitly choose between storing and recomputing based on the energy constraints.
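To make that store-vs-recompute choice concrete, here is a minimal sketch in Python. The function names and the stand-in workload are hypothetical, purely for illustration: the first version recomputes a result each time (more ALU work, no stored table), while the second memoizes it explicitly with `functools.lru_cache` (the kind of decision a compiler won't make for you once the cached data is too large to stay in registers or cache).

```python
import functools

# Hypothetical expensive function; the loop is just a stand-in for
# real work whose result might be worth caching.
def expensive(x):
    return sum(i * i for i in range(x))

# Option 1: recompute every time. Costs extra computation energy,
# but keeps no large stored result around.
def use_recompute(x):
    return expensive(x) + expensive(x)

# Option 2: memoize explicitly. Saves recomputation, but the cached
# results occupy memory and may cost DRAM traffic if they fall out
# of cache.
@functools.lru_cache(maxsize=None)
def memoized(x):
    return expensive(x)

def use_memoized(x):
    return memoized(x) + memoized(x)
```

Both versions produce the same answer; which one costs less energy depends on how expensive the computation is relative to the memory traffic needed to fetch the stored result.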
Well, one good example of a decision an app developer might face is the algorithmic choice between using a precomputed lookup table of tabulated f(x) values and computing f(x) on the fly.
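As a toy illustration of that choice (my own example, not from the slides), take f(x) = sin(x) over integer degrees: one version computes it directly each call, the other builds a 360-entry table once and then answers each query with a single load. The on-the-fly version spends ALU energy per call; the table version trades that for memory accesses, which can be the more expensive side if the table doesn't stay resident in cache.

```python
import math

# Option A: compute f(x) on the fly each call (ALU work per query).
def f_on_the_fly(deg):
    return math.sin(math.radians(deg))

# Option B: precompute a lookup table once; each query is then a
# single load, which may or may not hit in cache.
SIN_TABLE = [math.sin(math.radians(d)) for d in range(360)]

def f_table(deg):
    return SIN_TABLE[deg % 360]
```

Neither option is universally cheaper: the answer depends on how often f is called, how big the table is, and where in the memory hierarchy it ends up living.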
This is a nice blog post about saving bandwidth on ARM Mali mobile GPUs:
How low can you go? Building low-power, low-bandwidth ARM Mali GPUs, by Tom Olson.
And Mike Shebanow's HPG13 keynote should be interesting to some of you. See the comments about memory on slide 20.