> Is Squid able to act as an HTTP load balancer? Idea: one Squid receives
> all requests for a www.***.* site and forwards them to a farm of httpd
> daemons behind a firewall, then takes the responses and sends them back
> to the browser.
>
> The problems:
> - Squid must have some knowledge about the current load of every httpd
> - In some cases a series of requests from a specific client must be sent
> to the same httpd (session management, local data on the httpd)
>
> Or are there better solutions available (most commercial solutions are
> rather expensive)?
You know, random distribution could be quite reasonable; write a redirector
along these lines (a rough sketch follows the list):
* if URL isn't a "special" one -> return server[random() % servers]
* if URL is a "special" one -> return server[hash(client_address) % servers]
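As a rough sketch only: the server names and the "special" URL test below are
made-up placeholders, and it assumes the classic Squid redirector protocol
where each stdin line starts with the URL followed by the client address, and
the redirector prints one (possibly rewritten) URL per line on stdout.

#!/usr/bin/env python3
# Hypothetical redirector sketch; backend names and the "special" test are
# placeholders, not anything Squid ships with.
import sys
import random
import zlib

SERVERS = ["www1.internal", "www2.internal", "www3.internal"]  # hypothetical farm

def is_special(url):
    # Placeholder rule: URLs that need session affinity, e.g. CGI paths.
    return "/cgi-bin/" in url

def pick_server(url, client_ip):
    if is_special(url):
        # Same client always maps to the same backend (session stickiness).
        return SERVERS[zlib.crc32(client_ip.encode()) % len(SERVERS)]
    # Otherwise spread requests at random.
    return SERVERS[random.randrange(len(SERVERS))]

def rewrite(url, server):
    # Swap the original host for the chosen backend host.
    scheme, rest = url.split("://", 1)
    _, _, path = rest.partition("/")
    return "%s://%s/%s" % (scheme, server, path)

for line in sys.stdin:
    parts = line.split()
    if len(parts) < 2:
        print()            # malformed line: pass through unchanged
        sys.stdout.flush()
        continue
    url, client = parts[0], parts[1].split("/")[0]
    print(rewrite(url, pick_server(url, client)))
    sys.stdout.flush()     # one reply line per request line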
If you want to be smarter, you could maintain a shared memory segment with
one word per server holding its latest load average, multiplied by a constant
factor so it can be stored as an integer. Using integers means you don't have
to worry about a read during a write returning a completely bogus answer from
seeing two halves of different doubles: writing one positive integer over
another, and reading while that happens, will return a value between the two
integers [assuming EREW for each byte]. You could then bias your answers for
"normal" URLs (hopefully where most of the load comes from) based on the
current load averages.
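A minimal sketch of that biasing step, under assumed conventions: some other
process (not shown) periodically writes each backend's load average, scaled
by 100 to an integer, into a small shared file; "/tmp/loads" and the server
names are made-up, and the inverse-load weighting is just one plausible bias.

#!/usr/bin/env python3
# Load-biased server pick; assumes an external monitor keeps LOAD_FILE
# up to date with one 4-byte integer (load average * 100) per server.
import mmap
import random
import struct

SERVERS = ["www1.internal", "www2.internal", "www3.internal"]  # hypothetical farm
LOAD_FILE = "/tmp/loads"  # hypothetical shared segment, one int per server

def read_loads():
    with open(LOAD_FILE, "rb") as f:
        buf = mmap.mmap(f.fileno(), 4 * len(SERVERS), access=mmap.ACCESS_READ)
        return list(struct.unpack("%di" % len(SERVERS), buf[:4 * len(SERVERS)]))

def pick_biased():
    loads = read_loads()
    # Weight each server by the inverse of its scaled load average,
    # so lightly loaded backends are chosen more often.
    weights = [1.0 / max(load, 1) for load in loads]
    return random.choices(SERVERS, weights=weights, k=1)[0]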
David.
Received on Mon Oct 05 1998 - 02:33:06 MDT