Sorry...
My problem is this:
I have a Squid server 2.2.STABLE1 behind a firewall, so I use these lines:
acl all src 0.0.0.0/0.0.0.0
acl local_servers dstdomain ericsson.se
acl CGI url_regex -i ?
http_access allow all
never_direct allow all
never_direct allow CGI
always_direct allow local_servers
prefer_direct off
However, every URL that has a ? sign in it takes a long time to fetch.
I think Squid first tries to fetch it directly and only goes to the
cache_peer after that attempt times out.
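For reference, here is a minimal sketch of the query-URL matching used in
typical Squid 2.x configurations (not taken from the original message; the
QUERY name is only the conventional one, and the backslash is needed because
? is a regex metacharacter):

# match any request whose URL path contains "cgi-bin" or a query string
acl QUERY urlpath_regex cgi-bin \?
# send such requests to the parent cache rather than fetching them directly
never_direct allow QUERY

With an unescaped ? the pattern may be rejected or may not match the way the
lines above intend, which could leave query URLs eligible for direct fetching.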
-----Original Message-----
From: Henrik Nordstrom [mailto:hno@hem.passagen.se]
Sent: Monday, January 24, 2000 8:06 PM
To: Yoram Givon
Cc: 'squid'
Subject: Re: cgi-bin & ? in urls
Yoram Givon wrote:
>
> What do you think I should use:
> acl CGI url_regex -i ?
> or
> acl CGI urlpath_regex -i ?
???
You still haven't said what you are trying to achieve (or rather why).
Regarding URL matching: you should use the smallest acl type that covers the
URL component you want to match against, to conserve CPU. If you don't care
about CPU usage (rarely a problem unless you have huge lists of patterns),
then it does not matter that much.
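To illustrate the difference (a hedged sketch, not part of the original
reply): url_regex is tested against the complete URL, while urlpath_regex is
tested only against the path-and-query portion after the host name.

# tested against e.g. http://www.example.com/cgi-bin/test?x=1
acl CGI_full url_regex -i \?
# tested against e.g. /cgi-bin/test?x=1 only
acl CGI_path urlpath_regex -i \?

Both match any URL containing a question mark, but urlpath_regex gives the
regex engine a shorter string to scan, which is why the smaller acl type is
preferred here.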
--
Henrik Nordstrom
Squid hacker