- > > Have been using b17 and so far it is working fine. However, I would like
- > > to know: is there any theoretical or empirical result for having 2000
- > > ACLs, or is the performance predictable for such a big list?
- > >
- > The current implementation may be too simple for that many entries. They
- > are stored in a simple linear linked list, so too much time may be spent
- > traversing the list. It will depend on how busy your cache is.
- >
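To make the quoted concern concrete: with a linear list, every request walks
the list until an entry matches, so a 2000-entry ACL can mean up to 2000
string comparisons per request. Here is a rough sketch of that scan; the
structure and function are hypothetical, not Squid's actual code:

#include <string.h>

struct acl_entry {
    const char *pattern;        /* e.g. a banned host or URL prefix */
    int allow;                  /* 1 = allow, 0 = deny */
    struct acl_entry *next;     /* simple singly linked list */
};

/* Return the verdict of the first matching entry, or -1 for no match. */
static int
acl_check(const struct acl_entry *list, const char *url)
{
    const struct acl_entry *e;

    for (e = list; e != NULL; e = e->next)   /* O(n) walk on every request */
        if (strstr(url, e->pattern) != NULL)
            return e->allow;
    return -1;
}

Whether that cost matters depends on the request rate, which is what the
comment about how busy the cache is comes down to.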
- One thing I've been thinking about...
-
- The most common use of an access list of that size would be to ban
- unacceptable sites. A very NICE feature would be a special
- config file that would contain partial URLs, e.g.:
-
- http://www.blah.com/sexytoys/
- http://www.sex.com/
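To sketch how such a ban file might work (the loading scheme and code below
are hypothetical, not an existing Squid feature): read one partial URL per
line, then refuse any request whose URL starts with a listed prefix.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_BANS 4096

static char *banned[MAX_BANS];
static int n_banned;

/* Load one partial URL per line, e.g. "http://www.sex.com/". */
static void
load_ban_file(const char *path)
{
    char line[1024];
    FILE *fp = fopen(path, "r");

    if (fp == NULL)
        return;
    while (n_banned < MAX_BANS && fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\r\n")] = '\0';  /* strip the newline */
        if (line[0] != '\0')
            banned[n_banned++] = strdup(line);
    }
    fclose(fp);
}

/* A request is banned if its URL begins with any listed prefix. */
static int
is_banned(const char *url)
{
    int i;

    for (i = 0; i < n_banned; i++)
        if (strncmp(url, banned[i], strlen(banned[i])) == 0)
            return 1;
    return 0;
}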
If you wish to block an entire site (eg: playboy.com) then an addition to the
DNS code plus configuration may be less of a performance penalty.
Allow configuration of DNS entries via the squid.conf file. If you can specify
permanent entries, either positive or negative, then you could put permanent
negative entries in the DNS cache for the unacceptable sites. This is like
adding/deleting ARP entries.
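Roughly, a permanent negative entry would be a cache slot that never expires
and answers "unresolvable" at once, so a banned host is refused before any
forwarding work happens. A hypothetical sketch (the structure and flag names
are made up, not Squid's real DNS cache):

#include <string.h>
#include <strings.h>
#include <netinet/in.h>

#define DNS_PERMANENT 0x1   /* never expires, like a static ARP entry */
#define DNS_NEGATIVE  0x2   /* host is deliberately unresolvable */

struct dns_entry {
    char host[256];
    struct in_addr addr;        /* valid only when DNS_NEGATIVE is clear */
    int flags;
    struct dns_entry *next;
};

/* Returns 0 on success, -1 if the host is (permanently) unresolvable. */
static int
dns_lookup(const struct dns_entry *cache, const char *host,
           struct in_addr *out)
{
    const struct dns_entry *e;

    for (e = cache; e != NULL; e = e->next) {
        if (strcasecmp(e->host, host) == 0) {
            if (e->flags & DNS_NEGATIVE)
                return -1;      /* permanent negative: refuse at once */
            *out = e->addr;
            return 0;
        }
    }
    /* not cached: fall through to a real resolver query here */
    return -1;
}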
This won't handle partial URLs where you want to block only some areas of a
site (eg: only the http://www.blah.com/sexytoys/ section), but it is a start.
Permanent positive entries could be useful as well, though you would have to
be careful with them.
Thinking some more: if this could be done via the cachemgr.cgi interface (or
similar) instead of the config file and a restart, it would also allow testing
of improved dnsserver code, i.e. delete a DNS entry, reload the page, check
the stats for DNS performance, delete another DNS entry, and so on.
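The delete operation itself would be small. Building on the hypothetical
dns_entry list sketched above, something like:

#include <stdlib.h>
#include <strings.h>

/* Unlink one host's entry so the next request forces a fresh lookup.
 * Permanent entries are left alone, so a test delete cannot wipe out
 * the negative entries configured for banned sites. */
static void
dns_delete(struct dns_entry **cache, const char *host)
{
    struct dns_entry **pp;

    for (pp = cache; *pp != NULL; pp = &(*pp)->next) {
        if (strcasecmp((*pp)->host, host) == 0 &&
            !((*pp)->flags & DNS_PERMANENT)) {
            struct dns_entry *dead = *pp;

            *pp = dead->next;   /* unlink, then free the entry */
            free(dead);
            return;
        }
    }
}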
--
Neil Murray                      Email: neil@aone.com.au
Access One Pty. Ltd.             http://www.aone.net.au/
41 Malcolm Rd., Braeside         Phone: +61 3 9239 1444
Victoria, Australia 3195         Fax:   +61 3 9580 5581