I have checked the memory, but there is plenty available! The file
descriptors do not exceed 700 either, even when the server is very slow in
the cache manager. I have tried to get cached pages using the client
program, and the response is perfectly normal. Even across the LAN the
response is very good.
No errors concerning file descriptors or memory come up in cache.log!
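
In case it helps, the same comparison can be scripted. Below is a minimal
Python sketch (the proxy address and test URL are placeholders, not my real
setup) that times a few regular fetches through the proxy and then one
cache-manager "info" request, roughly what the "client" program does:

    # Minimal sketch: compare the response time of a regular cachable URL
    # fetched through the proxy with the time a cache-manager "info" request
    # takes. Proxy host/port and the test URL are placeholders.
    import socket
    import time
    import urllib.request

    PROXY_HOST = "localhost"              # assumed Squid host
    PROXY_PORT = 3128                     # assumed Squid HTTP port
    TEST_URL = "http://www.example.com/"  # any cachable URL

    def time_proxied_fetch(url):
        """Fetch url through the proxy; return (seconds, bytes received)."""
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler(
                {"http": f"http://{PROXY_HOST}:{PROXY_PORT}"}))
        start = time.time()
        with opener.open(url, timeout=60) as resp:
            body = resp.read()
        return time.time() - start, len(body)

    def time_cachemgr_info():
        """Ask the cache manager for its 'info' page over the HTTP port."""
        start = time.time()
        with socket.create_connection((PROXY_HOST, PROXY_PORT),
                                      timeout=60) as s:
            s.sendall(b"GET cache_object://localhost/info HTTP/1.0\r\n\r\n")
            body = b""
            while True:
                chunk = s.recv(4096)
                if not chunk:
                    break
                body += chunk
        return time.time() - start, len(body)

    for i in range(3):
        t, n = time_proxied_fetch(TEST_URL)
        print(f"regular fetch {i + 1}: {n} bytes in {t:.2f}s")

    t, n = time_cachemgr_info()
    print(f"cache manager info: {n} bytes in {t:.2f}s")

If the regular fetches stay fast while the "info" request alone takes a long
time, the slowdown really is confined to the cache manager side.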
-------------------------------------------------
Hossam El-Ashkar
h_elashkar@ieee.org
----- Original Message -----
From: Alex Rousskov <rousskov@ircache.net>
To: Hossam El-Ashkar <hoashkar@idsc1.gov.eg>
Cc: squid <squid-users@ircache.net>
Sent: Monday, May 31, 1999 19:31
Subject: Re: web interface
> On Mon, 31 May 1999, Hossam El-Ashkar wrote:
>
> > I have tried to use the client program instead of the web CGI. The same
> > thing happened; it took a very long time to get the output!
>
> How long does it take to get a regular page? Try requesting some cachable
> URL several times using the "client" program and see what kind of response
> time you have. Perhaps the problem is not specific to the cache manager.
> Also, if it starts OK and then slows down, you are probably running out of
> some resource like memory or file descriptors. Any warnings in cache.log?
>
> Alex.
>
> > Could it be Solaris that is causing the problem? It is 2.7 x86! This is
> > my first try on it! The server is not even heavily loaded; it handles
> > about 500 to 600 requests per minute. It even caches on RAID II storage!
> > So what could be the reason for Squid to take so long to give me
> > statistics? Note that when it is freshly started, it works normally!
>
>
Received on Mon May 31 1999 - 12:48:41 MDT