Hi there!
I would like to scan web traffic for viruses, filter access by IPs and 
URLs, and cache the whole thing.
My setup at the moment:
client --> squid (+ squidGuard) --> dansguardian (+ clamav) --> squid --> web
I am not happy about using two different squids. But the problem is: 
dansguardian needs a parent proxy to forward its requests to, and behind 
dansguardian I lose the information about the source IP. This source IP 
is needed to apply client-specific access rules. Therefore, the first 
squid (with squidGuard) is needed.
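To illustrate the mechanism in question: the original client address can 
only survive the hop through dansguardian if it is carried in an 
X-Forwarded-For header on the forwarded request, roughly like this (the 
host and address below are made up):

```
GET http://www.example.com/ HTTP/1.0
Host: www.example.com
X-Forwarded-For: 192.168.1.23
```

Without that header (or without a squid that trusts it), the second 
squid only ever sees dansguardian's own address.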
Dansguardian suggests a solution: a patch called follow_xff (xff: 
X-Forwarded-For) which would be applied to the second squid so that 
server could determine the real source IP.
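If it helps to make the question concrete, this is the kind of 
configuration I would expect on the second squid once the patch is in 
(directive names taken from the follow_xff project; I have not verified 
them against a current release, and the address is made up, so treat 
this as a sketch):

```
# squid.conf on the second squid, behind dansguardian (sketch)
# 10.0.0.5 stands in for the box dansguardian runs on
acl dg_host src 10.0.0.5
follow_x_forwarded_for allow dg_host   # trust XFF only from dansguardian
follow_x_forwarded_for deny all
acl_uses_indirect_client on            # match ACLs against the original client IP
```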
But the patches on http://devel.squid-cache.org/follow_xff/ apply only 
to two specific source trees. What about current builds? Is there 
another solution?
I like dansguardian because it is coded in C (and not in slow Perl or 
the like), but two squids plus dansguardian etc. -> slow surfing!
Greets
jacusy
Received on Thu Feb 09 2006 - 04:19:04 MST