Site throughput doesn't change. You're still going to be processing the same number of requests per second.
Yes, latency is what changes, as shown in the slide, since the time spent waiting in the queue increases.
I think as the request rate increases from 0 to a number larger than R, throughput first increases and then levels off at R requests per second, so the graph is a rising slope followed by a horizontal line. Latency starts at, say, T seconds, and once a backlog of L requests builds up it becomes roughly T + L/R seconds, so that graph is a flat line followed by a higher flat line.
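The shape described above can be sketched with a tiny deterministic fluid model of a single-server queue. This is just an illustration under made-up assumptions (a fixed service rate R, one service time T = 1/R, and whole-second ticks), not a faithful queueing simulation:

```python
def simulate(arrival_rate, service_rate, duration):
    """Fluid model of a single-server queue.

    Each second, `arrival_rate` requests arrive and at most
    `service_rate` (R) are completed; the rest wait in the queue.
    Returns (throughput, latency): throughput is completed requests
    per second, and latency is one service time T = 1/R plus the
    time to drain the remaining backlog, L / R.
    """
    queue = 0.0
    completed = 0.0
    for _ in range(duration):
        queue += arrival_rate          # new requests join the queue
        served = min(queue, service_rate)  # server can do at most R/sec
        queue -= served
        completed += served
    throughput = completed / duration
    base_service_time = 1.0 / service_rate      # T
    latency = base_service_time + queue / service_rate  # T + L/R
    return throughput, latency


# Below capacity: throughput tracks the load and latency stays near T.
print(simulate(arrival_rate=50, service_rate=100, duration=10))
# Above capacity: throughput caps at R and latency grows with the backlog.
print(simulate(arrival_rate=150, service_rate=100, duration=10))
```

Running it shows the two regimes: under R the queue stays empty and latency is just T, while over R throughput saturates at R and latency keeps climbing as the backlog L grows.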
Throughput will be the same, but as more people send requests, latency will increase. Would it be better to serve one group of users at decent latency and queue up the rest? In the case of a website being bombarded with users, maybe due to some viral content, what would be the best way to handle that?