Slide 56 of 69

This slide got me thinking about what it means for the iterations of a loop to be independent. It seems to me that there are two different senses of independence. Iterations can be resource independent, like the loop on this slide, where no two iterations touch the same data. They can also be resource dependent but order independent: summing the result of each iteration is an example, since the sum is a shared variable, but the order in which iterations contribute to it doesn't matter.

I know you can parallelize resource-independent loops with ISPC. It seems like you could also exploit parallelism in a loop that is resource dependent but order independent... but have we seen the ISPC tools to do this?
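For what it's worth, ISPC's standard library does provide cross-program-instance reductions such as `reduce_add`, which fit the "resource dependent but order independent" case. A minimal sketch of a parallel sum (untested; the function name `sum_array` is my own, and this assumes ISPC's `foreach` and `reduce_add`):

```
// ISPC sketch: order-independent sum over an array.
export uniform float sum_array(uniform float a[], uniform int n) {
    float partial = 0.0f;
    // Each program instance accumulates its own private partial sum,
    // so there is no contention inside the loop.
    foreach (i = 0 ... n) {
        partial += a[i];
    }
    // Combine the per-instance partials across the gang at the end.
    // The order of combination is unspecified, which is fine because
    // the sum is order independent (up to floating-point rounding).
    return reduce_add(partial);
}
```

Note that because floating-point addition is not associative, different execution orders can produce slightly different results, which is usually acceptable but worth knowing.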


@ak47. I like this comment. But in your second example, I'd say the iterations are not independent, since they must synchronize their accesses to the shared variable. However, they can certainly be executed in any order without violating the semantics of the program, and that's what's important.