Dear all,
Our top-level Web caching server is processing 800,000 to 1,000,000
requests a day, with a peak of 20 requests per second, on a (non-dedicated)
Digital AlphaServer-4000 running Squid v1.1.8.
25% of the requests are TCP requests with a 7% hit rate and 75% are UDP
requests with a hit rate of 5%. Nearly 75% of the traffic (in bytes) is
HTTP and 25% is FTP. [The number of FTP requests is negligible; however,
the traffic measured in bytes is reasonably substantial (25% of the total).]
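For reference, a quick back-of-the-envelope calculation of the average
request rate and the blended hit ratio implied by the figures above (a
small illustrative Python snippet; the numbers are simply the ones quoted):

    # Rough load estimate from the figures quoted above (illustrative only).
    requests_per_day = 1_000_000          # upper end of 800,000-1,000,000/day
    seconds_per_day = 24 * 60 * 60

    avg_rps = requests_per_day / seconds_per_day
    print(f"average rate: {avg_rps:.1f} req/s (observed peak: 20 req/s)")

    # Blended hit ratio: 25% TCP requests at 7% hits, 75% UDP (ICP) at 5% hits.
    overall_hit = 0.25 * 0.07 + 0.75 * 0.05
    print(f"overall hit ratio: {overall_hit:.1%}")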
At certain moments, however, the Squid process seems to lock up, which
can only be resolved by killing and restarting the Squid process.
Which components of the system (memory, disk, I/O, network bandwidth,
CPU) limit the maximum number of requests a system can handle? And,
guessing, what is the maximum achievable for Squid on, say, a Digital
AlphaServer-4000?
Cheers, Henny
-----------------------------------------------------------------------------
E-Mail: Henny.Bekker@cc.ruu.nl ! Disclaimer: The main obstacle to
http:   http://www.ruu.nl/~henny ! progress is not ignorance, but
PTT: Voice: +31 30 2536971 Fax: +31 30 2531633 ! the illusion of knowledge
X400: /G=Henny/S=Bekker/OU=cc/O=ruu/PRMD=surf/ADMD=400net/C=nl o
Paper: H.J.Bekker, Utrecht University, Computer Centre _ /- _
Po Box 80011, 3508 TA Utrecht Nederland (_) > (_)
-----------------------------------------------------------------------------