afzhang

I just wanted to share something cool I recently found out about Haskell with regard to producer-consumer locality! Everything in Haskell is lazy, so when you apply map to a list, all it secretly does is keep track of your map and a reference to the input list. Then, when you eventually try to access the first element of the list, Haskell will actually apply the map (and any other maps/functions you're applying to the list) to give you back that element. This takes advantage of locality really well, because if you have multiple maps it can apply all of them to one element of the list before moving on to the next one.
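
To see the interleaving for yourself, here's a tiny made-up example (the kernel names `double` and `inc` are just placeholders, not anything from the lecture); the `Debug.Trace` calls print when each function actually fires:

```haskell
import Debug.Trace (trace)

-- Two hypothetical "kernels"; trace reports when each one actually runs.
double :: Int -> Int
double x = trace ("double " ++ show x) (2 * x)

inc :: Int -> Int
inc x = trace ("inc " ++ show x) (x + 1)

main :: IO ()
main = do
  let xs = [1, 2, 3] :: [Int]
      ys = map inc (map double xs)  -- nothing has run yet: ys is a thunk
  -- Forcing the result interleaves the two maps per element
  -- (roughly: double 1, inc 2, double 2, inc 4, double 3, inc 6),
  -- instead of doubling the whole list before incrementing any of it.
  print ys  -- [3,5,7]
```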

kayvonf

The type of transformation that a compiler might do to improve producer-consumer locality in a stream program is nicely illustrated in the comparison of two pieces of C code on slide 28 of a later performance optimization lecture.

A program written in a streaming language conveys enough structure to the compiler for it to perform such "loop fusion" optimizations.
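
For anyone reading this without the later slides handy, here is a rough sketch of the unfused vs. fused versions, written in Haskell rather than C and not the actual slide 28 code (the kernel functions are made up); GHC's list-fusion rewrite rules perform this kind of transformation automatically, which is the same structural advantage a streaming language gives its compiler:

```haskell
-- Hedged sketch: the same two-"kernel" pipeline written as two passes
-- and as one fused pass.

twoPass :: [Float] -> [Float]
twoPass xs = map (+ 1) (map (* 2) xs)
-- In a strict language this materializes the whole intermediate list
-- before the second loop consumes it.

fused :: [Float] -> [Float]
fused xs = map ((+ 1) . (* 2)) xs
-- One traversal: each element goes through both kernels while it is
-- still close to the processor (the fused loop a compiler would emit).

main :: IO ()
main = do
  let xs = [1 .. 5] :: [Float]
  print (twoPass xs)  -- [3.0,5.0,7.0,9.0,11.0]
  print (fused xs)    -- same result, one pass
```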

parallelfifths

I understand why it might be important for the compiler to figure out prefetching, but producer-consumer locality seems easy for the programmer to handle without compiler logic. As a programmer, don't I probably know the size of my buffer/cache? If I do, can't I just make sure that my stream is the same size as the buffer/cache and then have one stream's worth of data go through all the kernels before fetching more? Why is it necessary to build fancy compilers to do this?