Vincent

I have a question: is it still better to make an even distribution of work across all the "child" threads? It seems that the idle children will just steal work from the other threads anyway.

firebb

@Vincent Yes, I think it would be better to make an even distribution if we already know the workload in advance, since that has less overhead. However, in many cases we can't know the workload ahead of time. Another issue is that what an even distribution looks like differs a lot from task to task. A task queue like this is a more general mechanism and is easier to use.
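
A minimal sketch (my own, not the course's actual code) of what such a dynamic scheme can look like for quicksort in C++: each partition step spawns one side as a new task, so whichever workers are free pick up whatever subproblems turn out to be large, without anyone precomputing an "even" split up front. The `GRAIN` cutoff and the use of `std::async` in place of a real work-stealing runtime like Cilk or TBB are simplifications.

```cpp
#include <algorithm>
#include <cassert>
#include <future>
#include <random>
#include <vector>

const long GRAIN = 1 << 14;   // below this size, serial sort beats the cost of spawning

void quicksort(int* begin, int* end) {
    if (end - begin <= GRAIN) {
        std::sort(begin, end);    // serial base case
        return;
    }
    int pivot = begin[(end - begin) / 2];
    // Three-way partition: [begin, mid1) < pivot, [mid1, mid2) == pivot, [mid2, end) > pivot.
    int* mid1 = std::partition(begin, end, [pivot](int x) { return x <  pivot; });
    int* mid2 = std::partition(mid1,  end, [pivot](int x) { return x == pivot; });
    // Spawn the left side as a separate task; recurse on the right side here.
    // Subproblem sizes depend on the data, so the scheduler, not the programmer,
    // ends up balancing the load.
    auto left = std::async(std::launch::async, quicksort, begin, mid1);
    quicksort(mid2, end);
    left.wait();
}

int main() {
    std::vector<int> v(1 << 20);
    std::mt19937 gen(1);
    for (int& x : v) x = static_cast<int>(gen());
    quicksort(v.data(), v.data() + v.size());
    assert(std::is_sorted(v.begin(), v.end()));
}
```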

Vincent

@firebb Thanks, that's what I thought, but I wasn't sure. For the quicksort case, the workload is not deterministic and really depends on the input. It seems like it would be wiser to come up with a good strategy for filling the task queue so that the work ends up evenly distributed.

kayvonf

@Vincent. Careful with your use of the term "nondeterministic". Non-deterministic means that different executions of the program with the same inputs might produce different results. For example, if the order in which independent work is executed affects the output of the program, then the output might be non-deterministic. A program like quicksort is certainly deterministic: it gives the same output regardless of how it is scheduled in parallel.

However, I could see a situation where the spawned tasks might differ from run to run if the implementation of partition() used a random choice of pivot. In that case, even with the same initial seed, different orders of execution would result in different subtasks drawing different random pivots. This wouldn't affect the final sorted result, but it would potentially cause different subtask breakdowns on different executions with the same input array.
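
To make that last point concrete, here is a minimal sketch (mine, not from the lecture) of a partition() that draws its pivot from a single seeded generator shared by all tasks. Because concurrently running tasks race for the lock, which random number a given subarray receives depends on the scheduling order, so the subtask breakdown can vary from run to run even with the same seed, while the sorted output never changes.

```cpp
#include <algorithm>
#include <mutex>
#include <random>
#include <vector>

std::mt19937 shared_rng(42);   // same seed on every run
std::mutex   rng_lock;         // concurrently running tasks race to draw from it

// Partition [begin, end) around a randomly chosen pivot element.
int* partition_random(int* begin, int* end) {
    long idx;
    {
        std::lock_guard<std::mutex> guard(rng_lock);
        idx = std::uniform_int_distribution<long>(0, end - begin - 1)(shared_rng);
    }
    int pivot = begin[idx];
    // The split point (and hence the sizes of the two subtasks a parallel
    // quicksort would spawn) depends on which pivot this call happened to draw.
    return std::partition(begin, end, [pivot](int x) { return x < pivot; });
}

int main() {
    std::vector<int> v = {5, 3, 8, 1, 9, 2, 7};
    int* mid = partition_random(v.data(), v.data() + v.size());
    (void)mid;   // in a full quicksort, [begin, mid) and [mid, end) become new subtasks
}
```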