We have a Linux box running SUSE 10.0 set up as a router and filtering web
proxy, sharing our DSL connection among 7 Windows XP computers. It's running
squid and squidGuard with a very large blacklist of forbidden URLs and
phrases.
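For what it's worth, the relevant part of squid.conf looks something like
this (the paths are from memory, so they may not be exact on our box):

    # squid.conf excerpt (paths approximate)
    redirect_program /usr/sbin/squidGuard -c /etc/squidguard.conf
    redirect_children 5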
Because we basically have no money, the SUSE box is an old 400MHz Pentium II
PC with only 256MB of RAM. That isn't likely to change in the near future,
except that I might be able to get some more RAM if necessary.
Squid is set up to run 5 squidGuard processes. When we boot SUSE it takes
15-20 minutes, with lots of disk thrashing, for the 5 squidGuards to read in
the blacklists and build their tables. During this time the web proxy is
non-functional, so we usually leave the SUSE box running 24/7 to avoid
having to wait for it.
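I've seen mention that squidGuard can pre-compile the plain-text lists into
Berkeley DB files so the helpers don't have to rebuild their tables on every
start. Would something like this help? (Untested here; the db directory and
squid user below are guesses for our setup.)

    # compile all lists referenced in squidguard.conf to .db files
    squidGuard -C all
    # db path and squid user are guesses - adjust to taste
    chown -R squid:nogroup /var/lib/squidGuard/db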
Much of the time it works fine, but every now and then, for no obvious
reason, squid decides it needs to start more squidGuard processes, which
effectively cuts off all web access. I'm not sure exactly what happens;
maybe sometimes it just kills the existing squidGuards and starts new ones,
but it sometimes seems to end up running 10 squidGuards and thrashing the
disk hard for ages, leaving the users with no web access.
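Next time it happens I'll look for helper deaths or restarts in cache.log,
along these lines (the log path is what I'd expect, but it may differ on
SUSE):

    # search for redirector start/exit messages; path may vary
    grep -i redirector /var/log/squid/cache.log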
When it's all running properly, free -m seems to indicate that there is
enough memory (only about 68MB is actually in use once buffers and cache are
excluded):
                 total       used       free     shared    buffers     cached
    Mem:           250        246          3          0         51        126
    -/+ buffers/cache:         68        181
    Swap:          400          2        397
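In the meantime I can at least keep an eye on how many helpers are running
with something like:

    # count running squidGuard helpers every 2 seconds
    watch 'ps ax | grep -c "[s]quidGuard"'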
Does anyone know what's going on and how to stop it happening?
--
Brian Gregory.
brian.gregory05@btconnect.com
Computer Room Volunteer.
Therapy Centre.
Prospect Park Hospital.