> From: Henrik Nordstrom <henrik_at_henriknordstrom.net>
> Date: Tue, 13 Oct 2009 12:54:30 +0200
> To: Ross Kovelman <rkovelman_at_gruskingroup.com>
> Cc: "squid-users_at_squid-cache.org" <squid-users_at_squid-cache.org>
> Subject: Re: [squid-users] Bad url sites
>
> Mon 2009-10-12 at 23:12 -0400, Ross Kovelman wrote:
>> I use a file called bad_url.squid to list the sites I want blocked. I
>> think I have reached a limit to what it can hold: when I do a
>> reconfigure, it can take a few minutes for the data to be scanned, and
>> processing power gets eaten up.
>
> What kind of ACL are you using?
>
> I tested Squid-2 with dstdomain acls on the order of 100K entries some
> time ago, and parsing took just a few seconds on my 4-year-old desktop.
>
> I ran the test again on the same box, this time with a list of 2.4M
> domains. Total parsing time was 55 seconds with Squid-3 and 49 seconds
> with Squid-2.
>
> Regards
> Henrik
>
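(For context on the numbers above: a dstdomain list file is plain text with
one domain per line. An entry with a leading dot matches that domain and all
of its subdomains; without the dot it matches only that exact host. A minimal
sketch, with placeholder names:

    example.com
    .example.net
    .ads.example.org
)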
This is what I have:

    acl bad_url dstdomain "/usr/local/squid/etc/bad-sites.squid"

Should I use something else? Is this the best way? I am on Squid 2.7.
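(For completeness, the usual way a dstdomain ACL like this is wired into
access control is sketched below; the deny rule and its position relative to
the rest of the http_access list are assumptions about the surrounding
config, not taken from the original message:

    acl bad_url dstdomain "/usr/local/squid/etc/bad-sites.squid"
    # Deny matching requests before any broader allow rules; the actual
    # placement among the other http_access lines is assumed here.
    http_access deny bad_url

After editing the list file, squid -k reconfigure reloads it.)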
Thanks