I am also having problems with this, but I found that if you use
acl porn_domains dstdomain "/usr/local/squid/blacklists/porn/domains"
and put a period/full stop in front of each domain in the file, it will block that domain and any of its subdomains,
for example
.google.com blocks
google.com
www.google.com
images.google.com
groups.google.com
fish-soup.google.com
and so on. For plain domain lists, I believe this is as effective as dstdom_regex.
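Putting it all together, here is a minimal sketch of what I mean (the file path matches the acl line above; the entries themselves are just examples):

  # /usr/local/squid/blacklists/porn/domains -- one domain per line;
  # a leading dot matches the domain itself and every subdomain of it
  .sex.com
  .google.com

  # squid.conf
  acl porn_domains dstdomain "/usr/local/squid/blacklists/porn/domains"
  http_access deny porn_domains
  # ... your usual http_access allow rules follow here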
With a 9 MB text file like yours, however, I have not managed to get Squid to run stably, or even to reconfigure.
My two questions are:
Why does Squid's memory usage increase by nearly 320 MB when the file is only 9 MB?
Which of the redirectors/plug-ins is best for managing large blacklists, if this approach just won't work at this scale?
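For what it's worth, squidGuard seems to be the redirector most often mentioned for big blacklists, since it keeps the lists in Berkeley DB files instead of loading them into Squid's memory. A rough sketch of the setup as I understand it (all paths and the blocked-page URL are placeholders, and I have not tested this myself):

  # squid.conf
  redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
  redirect_children 5

  # /usr/local/squidGuard/squidGuard.conf
  dbhome /usr/local/squidGuard/db
  logdir /usr/local/squidGuard/logs

  dest porn {
          domainlist porn/domains
  }

  acl {
          default {
                  pass !porn all
                  redirect http://proxy.example.com/blocked.html
          }
  }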
Thanks
Paul
-----Original Message-----
From: Carsten Jensen [mailto:cj@datamann.dk]
Sent: 28 February 2006 10:19
To: squid-users@squid-cache.org
Subject: RE: [squid-users] Memory Error when using large acl files
You can see my config below.
As to why: well, I don't want my users to surf porn.
I use dstdom_regex because the file contains e.g. sex.com,
but the homepage can be www.sex.com.
If I use .sex.com in the domains file (path below) with the dstdomain acl,
I can't even access www.google.com (the browser then shows
"page not found").
acl porn_domains dstdom_regex "/usr/local/squid/blacklists/porn/domains"
http_access deny all porn_domains
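To illustrate the difference between the two file formats (just a sketch; sex.com is only the example entry):

  # dstdomain file: plain domain names; a leading dot also matches subdomains
  .sex.com

  # dstdom_regex file: each line is a regular expression, so dots must be
  # escaped -- an unescaped 'sex.com' would also match hosts like 'sexXcom.example'
  (^|\.)sex\.com$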
best regards
Carsten Jensen
On Mon 2006-02-27 at 14:30 +0100, Carsten Jensen wrote:
> Hello.
>
> I have a problem: I have a large file containing a lot of domain names,
> all of which I want to block. The file is around 9 MB.
What kind of acl are you using, and why that kind of acl?
> aclParseRegexList: Invalid regular expression 'domain.tld': out of memory
Looks to me like you are using regex-based acls. Why?
Regards
Henrik