chenh1

So what schemes are used in a multi-threaded web server to protect security and avoid interference between threads?

ask

Using processes instead of threads helps sandbox the execution of different requests. This ensures that requests don't share the same address space during execution, and that a crash in a single process does not affect the execution of the other requests.
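
To make the process-per-request idea concrete, here is a minimal sketch using Python's standard `socketserver` module (this is just an illustration of the isolation argument, not how Apache is actually implemented):

```python
# Minimal sketch of a process-per-request server (Unix-only, uses fork()).
# Each connection is handled in its own forked child, so requests do not
# share an address space and a crash in one child does not take down the
# parent or other in-flight requests.
import socketserver

class RequestHandler(socketserver.StreamRequestHandler):
    def handle(self):
        self.rfile.readline()  # read the request line (parsing omitted)
        self.wfile.write(b"HTTP/1.0 200 OK\r\n\r\nhello\n")

if __name__ == "__main__":
    # ForkingTCPServer forks a new child process per connection.
    with socketserver.ForkingTCPServer(("localhost", 8080), RequestHandler) as srv:
        srv.serve_forever()
```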

yulunt

If a multi-threaded approach is used, we should assume third-party libraries are not thread-safe and handle concurrency issues explicitly to protect security. This may introduce additional synchronization overhead and hurt latency.
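
A rough sketch of the kind of explicit synchronization this implies, assuming a hypothetical shared in-memory cache touched by every worker thread (the names here are illustrative, not any particular server's API):

```python
# Worker threads in one address space must serialize access to shared state.
# The lock below is exactly the overhead that can show up as extra latency
# under contention.
import threading

cache = {}                      # shared across all worker threads
cache_lock = threading.Lock()   # protects every access to `cache`

def expensive_lookup(key):
    return key.upper()          # stand-in for real work (e.g., disk or DB access)

def handle_request(key):
    with cache_lock:            # correctness at the cost of contention
        if key in cache:
            return cache[key]
    value = expensive_lookup(key)   # do the slow work outside the lock
    with cache_lock:
        cache[key] = value
    return value
```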

shhhh

Does splitting into processes not produce a latency problem with caches?

paracon

@shhhh I am not sure I understand the latency problem you are referring to. Could you elaborate?

themj

What does recycling workers mean? Does it relate to processes not being freed properly?

chenboy

I think that in a web server that is implemented correctly, a multi-threaded server has more advantages because context switching is more lightweight and the programmer has more fine-grained control over scheduling.
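
Here is a rough sketch of the thread-pool variant this describes, again using only Python's standard library as an illustration (not a production design):

```python
# Thread-per-request server with a bounded thread pool. All workers share one
# address space, so dispatching a connection is just a queue push plus a
# thread wakeup -- cheaper than forking a process per request -- and the pool
# size gives the programmer a knob over how requests are scheduled.
import socket
from concurrent.futures import ThreadPoolExecutor

def handle(conn):
    with conn:
        conn.recv(4096)                                  # read the request
        conn.sendall(b"HTTP/1.0 200 OK\r\n\r\nhello\n")  # trivial response

if __name__ == "__main__":
    pool = ThreadPoolExecutor(max_workers=16)            # scheduling knob
    with socket.create_server(("localhost", 8080)) as srv:
        while True:
            conn, _ = srv.accept()
            pool.submit(handle, conn)                    # hand off to a worker thread
```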

vadtani

@chenboy I did not understand what you meant by 'context switching is more lightweight'. How is context switching more lightweight in web servers?

jedi

@themj Apache periodically restarts its worker processes because, empirically, a long-running Apache worker accumulates a large memory footprint. This is generally attributed to memory leaks and process bloat.

@vadtani, I believe that @chenboy is comparing multi-threaded to multi-process web servers, where context switching is definitely more lightweight, and programmer-defined schedules are easier to generate.

And the latency issue with caches seems to come down to address spaces: threads in a multi-threaded system share one address space and can therefore share common regions of memory (such as an in-memory cache) at lower latency than separate processes in a multi-process system can.
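
A small sketch of that last point, assuming a hypothetical in-memory page cache (the names are illustrative, not from Apache): threads read the shared structure directly, while separate processes have to go through some form of IPC or duplicate the data.

```python
# Threads: shared_cache is visible to every thread for free.
# Processes: the cache must be exported via IPC (here a multiprocessing
# Manager proxy) or copied into each child's address space.
import threading
import multiprocessing as mp

shared_cache = {"/index.html": b"<html>...</html>"}   # lives in one address space

def thread_worker(path):
    return shared_cache.get(path)        # direct read: no copy, no IPC

def process_worker(path, managed_cache):
    return managed_cache.get(path)       # every access is an IPC round trip

if __name__ == "__main__":
    t = threading.Thread(target=thread_worker, args=("/index.html",))
    t.start(); t.join()

    with mp.Manager() as m:
        managed = m.dict(shared_cache)   # proxy object backed by a manager process
        p = mp.Process(target=process_worker, args=("/index.html", managed))
        p.start(); p.join()
```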