Hello Guys,
    I'm running Squid 2.5S4 on a RedHat 9 box in a fairly large network 
(350 machines). Web access is allowed only with authentication. That part 
works perfectly.
    But some machines have been infected with ad-ware and viruses, and 
this malware tends to make MANY http requests to fetch data. Those 
requests are all DENIED, because the malware never authenticates with 
squid.
    Well ... that's also OK. The problem is that some of these machines 
make http requests at a VERY fast rate, around 20 per second. Some weeks 
my access.log reaches its limit (2Gb) before the weekly log rotation 
happens. When it does, squid crashes and dies with:
FATAL: logfileWrite: /var/log/squid/access.log: (11) Resource temporarily 
unavailable
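    Doing the math: at 20 requests per second, those infected machines 
alone add about 1.7 million log lines a day (20 x 86400), on top of the 
normal traffic from 350 machines, so hitting 2Gb before the weekly 
rotation is no surprise. As a stopgap I suppose I could rotate daily from 
cron; this is just a sketch, assuming the RedHat RPM's /usr/sbin/squid 
path:

# rotate squid's logs every night at midnight instead of weekly
0 0 * * * /usr/sbin/squid -k rotate

    But that only postpones the problem, so on to the real question.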
    Question: is this 2Gb log file limit imposed by squid or by some OS 
limitation? I think it's squid, because I can create files bigger than 
2Gb on the same machine, so the filesystem itself clearly handles them 
and the limit must be in the program doing the writing. Here's a 3.5Gb 
file being successfully generated:
[root@proxy log]# dd if=/dev/urandom  of=bigfile bs=1M count=3500
3500+0 records in
3500+0 records out
[root@proxy log]# ls -lsa bigfile
3587504 -rw-r--r--    1 root     root     3670016000 Oct 27 18:18 bigfile
[root@proxy log]#
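    One more check I thought of (my own reasoning, not yet verified 
here): if the log stalls at exactly 2147483647 bytes, that's 2^31 - 1, 
the largest offset a signed 32-bit off_t can represent, which would point 
straight at a squid built without large-file support:

[root@proxy log]# ls -l /var/log/squid/access.log | awk '{print $5}'

If that prints 2147483647 when squid dies, I guess dd gets past 2Gb only 
because it is compiled with large-file support, while squid 2.5 is not.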
    Question is: can squid be recompiled with some LARGEFILE option, 
allowing log files to grow larger than 2Gb?
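    What I had in mind is the standard glibc large-file macros at build 
time. I don't know whether 2.5 has a dedicated configure switch for this, 
so the flags below are a guess, not a tested recipe, and the configure 
options are placeholders for whatever the original build used:

[root@proxy squid-2.5.STABLE4]# CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" \
    ./configure --prefix=/usr
[root@proxy squid-2.5.STABLE4]# make && make install

Would that be enough, or does squid's logging code itself assume a 32-bit 
offset somewhere?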
    Sincerely,
    Leonardo Rodrigues