Would it be a good idea to make machines that specialize in one type of processing? For example, a webserver for a site that does a lot of little tasks in parallel.
I am wondering how the machines in such supercomputers are configured. Is there a single master, or multiple masters in different parts of the room constantly distributing work? And what kind of cache coherence mechanisms do they use?
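To make the "single master constantly distributing work" idea concrete, here is a minimal sketch using one shared queue: the master pushes tasks, and idle workers pull whenever they are free, so the distribution is dynamic rather than pre-assigned. The function name and the squaring "task" are just stand-ins, not anything a real supercomputer scheduler does.

```python
from queue import Queue
from threading import Thread

def master_worker_demo(tasks, num_workers=4):
    """One master thread feeds a shared queue; workers pull
    tasks as they become free (dynamic work distribution)."""
    work = Queue()
    results = Queue()

    def worker():
        while True:
            item = work.get()
            if item is None:          # sentinel: no more work for this worker
                work.task_done()
                return
            results.put(item * item)  # stand-in for real processing
            work.task_done()

    threads = [Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for task in tasks:                # the "master" handing out work
        work.put(task)
    for _ in threads:                 # one sentinel per worker
        work.put(None)
    work.join()                       # wait until every task is processed
    return sorted(results.queue)

# e.g. master_worker_demo(range(5)) -> [0, 1, 4, 9, 16]
```

At cluster scale the same pattern shows up with a network message queue instead of an in-process one, and the trade-off is exactly the one asked about above: a single master is simple but can become a bottleneck, which is one motivation for multiple masters.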
@shhhh, I think it is a good idea to build specialized machines if you know the workload and the characteristics of the data well enough to take advantage of them. In the case of a webserver, though, more thought is needed, mainly because the workload and its parallelism can change over time: they could shift with the seasons, or as the website and its target audience evolve.