On Mon, 13 Jul 1998, Jeff Madison wrote:
> I am running squid 1.2 beta 22 and have a quick question. How do I
> force the server to refresh a site that changes either once a day or
> many times a day. I know about the -m PURGE and -r options for the
> client but I need to know how to tell squid to update the entire web
> site. One site is http://www.usatoday.com for example. This site
> changes several times a day but the names of the files and graphics
> never change. I don't want to exclude this site from being cached
> because it is one of the more common sites for our users and is very
> slow when not cached. I need to know if there is a way to tell
> squid to refresh all objects in its cache associated with the base
> URL of http://www.usatoday.com without referencing every object
> individually.
I'm not sure I see what you mean about USA Today -- it seems that the
images and graphic files that would be subject to change (i.e.,
everything except the banners, buttons, etc.) are named appropriately
for the subject of the image, or are sequentially numbered. I may just be
missing it, though.
One thing you could do, and that I recommend everyone do, is send
e-mail to the webmaster asking that they implement "Expires:" headers
on the images which change without a filename change. Let's flood 'em
with requests.
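For what it's worth, the sort of thing I'd ask them for would look
roughly like this in the server's response for each image (the dates
here are just made up for illustration):

    HTTP/1.0 200 OK
    Content-Type: image/gif
    Last-Modified: Mon, 13 Jul 1998 12:00:00 GMT
    Expires: Mon, 13 Jul 1998 13:00:00 GMT

With an Expires time an hour or so out, every cache along the way
knows exactly how long it may serve the image without checking back.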
Another thing to do is change the refresh_pattern for that site:
refresh_pattern/i ^http://www.usatoday.com/.*\.gif 20 20% 1440
This would force any GIF on USA Today's website to be revalidated
(an If-Modified-Since check) once it was older than about 20 minutes.
You'd still get the
cache benefit if the image hadn't been modified, but you'd lessen the
likelihood that someone would get a stale image.
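In context, the relevant bit of squid.conf might look something like
this (the numbers are only a starting point, and remember the first
matching pattern wins, so the site-specific line has to come before
the catch-all):

    # Revalidate USA Today GIFs once they're ~20 minutes old
    # (fields are: regex, min minutes, percent of object age, max minutes)
    refresh_pattern/i ^http://www.usatoday.com/.*\.gif 20 20% 1440
    # Default rule for everything else
    refresh_pattern . 0 20% 4320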
-Mike.