One solution is to rotate the logs more frequently.
In squid.conf, specify how many old logs to keep with the line
logfile_rotate 5
where 5 is whatever number you choose.
Then use squid's built-in log rotation in a crontab line like this:
30 8 * * * /usr/sbin/squid -k rotate
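If once a day still lets access.log approach 2GB, the same command can run hourly. Each rotation renames the current log to access.log.0 and shifts older ones up to the logfile_rotate count, so nothing is lost. A sketch of an hourly crontab entry (the schedule is just an example; pick whatever keeps the file comfortably small):

0 * * * * /usr/sbin/squid -k rotate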
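Much of that log volume is the denied adware requests themselves. If upgrading is an option, Squid 2.6 and later have, as I recall, a log_access directive that can skip logging them entirely. A hedged squid.conf sketch, assuming the noise is exactly the unauthenticated requests (the ACL name "authed" is mine, not from the original message):

acl authed proxy_auth REQUIRED
log_access deny !authed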
Leonardo Rodrigues Magalhães wrote:
> 
>    Hello Guys,
> 
>    I'm running Squid 2.5S4 on a RedHat 9 box in a fairly large network (350
> machines). Web access is allowed only with authentication. That's working
> perfectly.
> 
>    But when some machines get infected with adware and viruses, that
> software tends to make SEVERAL HTTP requests to fetch data. Those accesses
> are all DENIED, because the adware/viruses don't authenticate with Squid.
> 
>    Well ... that's also OK. But the problem is that some of them make
> HTTP requests at a VERY fast rate, like 20 per second. Some weeks, my
> access.log file reaches its limit (2GB) before the end of the week, when
> log rotation happens. When that happens, Squid crashes and dies with:
> 
> FATAL: logfileWrite: /var/log/squid/access.log: (11) Resource 
> temporarily unavailable
> 
>    Question: is this 2GB log file limit imposed by Squid or by some OS
> limitation? I think it's Squid, because I can create files bigger than
> 2GB on the machine. Here's a 3.5GB file being generated successfully:
> 
> [root@proxy log]# dd if=/dev/urandom  of=bigfile bs=1M count=3500
> 3500+0 records in
> 3500+0 records out
> [root@proxy log]# ls -lsa bigfile
> 3587504 -rw-r--r--    1 root     root     3670016000 Oct 27 18:18 bigfile
> [root@proxy log]#
> 
> 
>    Question is: can Squid be recompiled with some LARGEFILE option,
> allowing log files to grow larger than 2GB?
> 
> 
>    Sincerely,
>    Leonardo Rodrigues
> 
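On the large-file question above: the 2GB ceiling is almost certainly the classic 32-bit off_t limit, and it applies per program rather than to the filesystem. Your dd is built with large-file support, which is why the 3.5GB test file succeeds while Squid's own log writer fails. Squid's configure script has a large-file switch, --with-large-files; as I recall it needs 2.6 or a patched 2.5, so verify the flag against your source tree. A sketch of a rebuild from source, with an illustrative prefix:

./configure --prefix=/usr --with-large-files
make && make install

Until then, rotating before access.log approaches 2GB avoids the crash.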