Good $daytime,
> Date: Sun, 15 Mar 1998 08:50:25 -0500 (EST)
> From: Fraser Campbell <fraser@greynet.net>
> To: squid-users@nlanr.net
> Subject: Re: squid-users-digest Digest V98 #99
> I have been considering exactly this.  In our area there is an ISP
> with a service called Cleannet (http://www.cleannet.net).  I don't
> know exactly how they implement their service but I thought Squid
> would be up to the job.
There is a lot of good commercial software, and there are many such
services, around the world.  Our target would be a public domain
blacklist and/or software and/or service.
> It sounds like a very simple thing to implement - the hard part
> would be compiling the list of banned sites.
Okay, here in squid-users we all know _how_ to implement this.  I'd
like to concentrate on the latter: compiling the list of banned sites.
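For reference, the squid.conf side really is tiny.  A minimal sketch
(the ACL name and file path are my placeholders, and the
quoted-filename form, one domain per line, assumes your Squid accepts
a file reference here):

    # banned-domains holds one domain per line, e.g. ".badsite.com"
    acl banned dstdomain "/usr/local/squid/etc/banned-domains"
    http_access deny banned
    # ... your usual http_access rules follow ...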
> The guide also states that Squid is not optimized to do this for a
> large number of sites.  What would be involved in keeping Squid up
> to speed while doing this?  What non-Squid solutions would be best
> if it is impractical to do with Squid?
I expect splay or binary trees in the Squid code to address the
performance issues with large ACLs.  Another solution (hardly a
non-Squid one) is a URL-rewriting plugin, or the recently announced
Squirm.  Of course I don't want to say that a blacklist can't be used
with (say) Netscape Proxy.
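To make the rewriting route concrete, here is a minimal sketch of a
blacklist redirector.  This is my illustration, not Squirm itself,
and the blacklist path and block page URL are placeholders.  Squid
feeds the helper one request per line ("URL client_ip/fqdn ident
method") on stdin and reads back a replacement URL, or a blank line
for "no change", on stdout:

    #!/usr/bin/env python
    # Minimal Squid redirector sketch (illustration, not Squirm).
    import sys
    try:
        from urllib.parse import urlsplit   # Python 3
    except ImportError:
        from urlparse import urlsplit       # Python 2

    BLOCK_PAGE = "http://proxy.example.net/banned.html"  # placeholder

    # Load banned domains into a hash set: one lookup per request,
    # flat cost no matter how long the list grows.
    BLACKLIST = set()
    with open("/usr/local/squid/etc/banned-domains") as f:  # placeholder
        for line in f:
            domain = line.strip().lower().lstrip(".")
            if domain:
                BLACKLIST.add(domain)

    def banned(host):
        # Check the host and every parent domain, so that
        # "ads.badsite.com" is caught by a "badsite.com" entry.
        parts = host.lower().split(".")
        return any(".".join(parts[i:]) in BLACKLIST
                   for i in range(len(parts)))

    for request in sys.stdin:
        fields = request.split()
        if not fields:
            continue
        host = urlsplit(fields[0]).hostname or ""
        # A blank line tells Squid to leave the URL alone.
        sys.stdout.write((BLOCK_PAGE if banned(host) else "") + "\n")
        sys.stdout.flush()  # Squid waits on each answer; don't buffer

Hook it in with something like "redirect_program
/usr/local/squid/bin/redirector.py" in squid.conf.  The hash set is
the point: near-constant lookup cost per request, which is roughly
what splay trees would buy the native ACL code.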
  Regards,
  Willy.
--
"No easy hope or lies        | Vitaly "Willy the Pooh" Fedrushkov
 Shall bring us to our goal, | Information Technology Division
 But iron sacrifice          | Chelyabinsk State University
 Of Body, Will and Soul."    | mailto:willy@csu.ac.ru  +7 3512 156770
                   R.Kipling | http://www.csu.ac.ru/~willy  VVF1-RIPE