
Apache starts up a pool of processes and reuses them for requests. It's all highly efficient.

It's highly efficient if you have a small number of requests at any given time; it's not as efficient during peak hours.

If you had the need, I suppose you could. Although I'm not sure what point you're trying to make here.

There are several use-cases I can think of, the simplest being more than one blocking request for rendering your page, and using multi-process to reduce the latency to the maximal block time instead of the sum of all block times.
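The latency argument above can be sketched quickly. A minimal illustration in Python (the thread talks about PHP/cURL, but the timing math is the same; the fetch function and delay values are hypothetical), assuming each backend call simply blocks for a fixed time:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_fetch(delay):
    """Stand-in for one blocking backend request (hypothetical)."""
    time.sleep(delay)
    return delay

delays = [0.2, 0.3, 0.1]  # illustrative block times

# Sequential: total latency is the sum of the block times (~0.6 s here).
start = time.monotonic()
for d in delays:
    blocking_fetch(d)
sequential = time.monotonic() - start

# Concurrent: total latency approaches the maximal block time (~0.3 s here).
start = time.monotonic()
with ThreadPoolExecutor(max_workers=len(delays)) as pool:
    list(pool.map(blocking_fetch, delays))
concurrent = time.monotonic() - start

print(f"sequential {sequential:.2f}s, concurrent {concurrent:.2f}s")
```

In PHP the analogous tool would be curl_multi, which multiplexes several transfers without needing threads at all.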

You don't really need multithreading. JavaScript has callbacks for everything, for example, yet isn't multithreaded. So this isn't so much a problem with PHP as a gap in how cURL is exposed to it. But then cURL is a C library.

JavaScript in server environments needs solutions such as node.js. On the client side, well, most UI apps have a single UI thread and delegate jobs to worker threads. In browser JavaScript, asynchronous work is usually done via XMLHttpRequest callbacks rather than threads.
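The single-thread-plus-callbacks model node.js uses can be shown in a few lines. A rough sketch using Python's asyncio as a stand-in event loop (the names and delays are made up), assuming the "I/O" is just a non-blocking sleep:

```python
import asyncio

async def fetch(name, delay):
    # Simulated non-blocking I/O; the single-threaded event loop
    # interleaves the waits instead of blocking on each one.
    await asyncio.sleep(delay)
    return name

async def main():
    # One thread services both "requests" concurrently.
    return await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1))

print(asyncio.run(main()))  # ['a', 'b']
```

No threads are involved: the loop just resumes whichever callback is ready, which is exactly the trade node.js makes.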



> It's not as efficient for peak hours.

Actually it's quite fine during peak hours. You could, however, say it's a waste of resources during the lull times. But I'm not using those resources for anything else anyway; the entire purpose of the server is to service web requests. If a bunch of processes are sitting there wasting memory being idle, that's not a problem.
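The pre-forked pool being described can be sketched in miniature. A rough analogue using Python's multiprocessing.Pool (the pool size and request count are illustrative, not anything Apache-specific), assuming each "request" just reports which worker served it:

```python
from multiprocessing import Pool
import os

def handle(request_id):
    # Served by whichever pooled worker is free; processes are
    # reused across requests rather than forked per request.
    return request_id, os.getpid()

def serve(n_requests=12, pool_size=3):  # sizes are illustrative
    with Pool(processes=pool_size) as pool:
        results = pool.map(handle, range(n_requests))
    return {pid for _, pid in results}

if __name__ == "__main__":
    pids = serve()
    print(f"12 requests handled by {len(pids)} process(es)")
```

The point of the idle pool is visible here: the workers exist before any request arrives, so no fork happens on the request path.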

> There are several use-cases I can think of, the simplest being more than one blocking request for rendering your page

That's a fair case but I haven't personally encountered it yet. A good example, however, would be building a mashup page of content from a bunch of different web services.
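For the mashup case specifically, the extra wrinkle is partial failure: one slow or dead service shouldn't sink the whole page. A hedged sketch with asyncio (the service names, delays, and failure are all invented for illustration):

```python
import asyncio

async def call_service(name, delay, fail=False):
    # Hypothetical stand-in for one web-service call in a mashup page.
    await asyncio.sleep(delay)
    if fail:
        raise RuntimeError(f"{name} unavailable")
    return f"{name}: ok"

async def build_page():
    # Fan out to all services at once; return_exceptions=True lets
    # the page render with whatever succeeded.
    parts = await asyncio.gather(
        call_service("news", 0.05),
        call_service("weather", 0.05, fail=True),
        call_service("stocks", 0.05),
        return_exceptions=True,
    )
    return [p for p in parts if not isinstance(p, Exception)]

print(asyncio.run(build_page()))  # ['news: ok', 'stocks: ok']
```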



