If you do use a refresh pattern to force freshness, make sure its regex
matches ONLY this resource, or there may be... trouble.
For instance, if the resource is
http://foo.com/cgi-bin/bar.cgi
use
refresh_pattern -i ^http://foo.com/cgi-bin/bar.cgi ...
not
refresh_pattern -i /cgi-bin/
or you'll be forcing freshness on *everything* that matches /cgi-bin/.
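Spelled out as a squid.conf fragment, the difference looks something like this (the min/percent/max values here are placeholders I made up, not recommendations; note also that dots in the regex really ought to be escaped, and that refresh_pattern rules are tried in order with the first match winning, so put the specific rule before any catch-all):

```
# Narrow rule: anchored at the start and dots escaped, so only this
# one script can match. 30 = min minutes, 20% of age, 30 = max minutes.
refresh_pattern -i ^http://foo\.com/cgi-bin/bar\.cgi 30 20% 30

# Too broad -- this would force freshness on EVERY /cgi-bin/ URL
# passing through the cache:
# refresh_pattern -i /cgi-bin/ 30 20% 30
```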
Don't worry too much about removing the no_cache rules; dynamic pages won't
have validators anyway, so they shouldn't get cached regardless. (Henrik/anyone,
can you confirm this for Squid?)
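For reference, the stock rules being discussed are, in a typical squid.conf of this vintage, something like the following (quoting from memory, so treat as approximate). To let the refresh_pattern above take effect for the one script, these need to be removed, or narrowed so they no longer match that URL:

```
# The commonly recommended "don't cache dynamic content" rules.
# They match any URL path containing "cgi-bin" or a "?".
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
```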
<shameless plug>
For more information to give the Webmaster (where this really should be
fixed), see
http://www.pobox.com/~mnot/cache_docs/
</shameless plug>
> -----Original Message-----
> From: Henrik Nordstrom [mailto:hno@hem.passagen.se]
> Sent: Wednesday, May 12, 1999 12:13 AM
> To: Knut A. Syed
> Cc: squid-users@ircache.net
> Subject: Re: Caching dynamic pages (CGI) for one service/server
>
>
> Knut A. Syed wrote:
> >
> > I would like to cache pages for one specific service, (actually one
> > server,) even if they contain the strings "cgi-bin" and "?".
>
> 1. Don't use the recommended no_cache rules for blocking
> caching of such
> URLs.
>
> 2a. Use a refresh pattern to set a min refresh age on the URL pattern
> you want to cache.
>
> 2b. Convince the CGI-script owner to change the script to generate
> expiry / last-modified information to allow caching.
>
> --
> Henrik Nordstrom
> Spare time Squid hacker
>
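Henrik's option 2b, for anyone handing this to a CGI author: the script just has to emit Expires and/or Last-Modified headers so caches have something to work with. A minimal sketch in Python (the script name, body, and the one-hour lifetime are all my assumptions, not anything from the original thread):

```python
#!/usr/bin/env python
# Hypothetical CGI sketch: emit the headers that make the response cacheable.
import time
from email.utils import formatdate  # produces RFC 1123 HTTP dates

now = time.time()
print("Content-Type: text/html")
# Last-Modified gives caches a validator for revalidation.
print("Last-Modified: " + formatdate(now, usegmt=True))
# Expires one hour out says "this reply is fresh for an hour".
print("Expires: " + formatdate(now + 3600, usegmt=True))
print()  # blank line ends the headers
print("<html><body>Hello from a cache-friendly CGI</body></html>")
```

With headers like these in place, no refresh_pattern trickery is needed on the proxy side at all.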
Received on Tue May 11 1999 - 17:53:36 MDT