On Thu, 2003-05-08 at 00:05, Henrik Nordstrom wrote:
> > Alternatively, is there a deterministic way to get a list of
> > variants from squid for a URL? We cache rather aggressively, and
> > we want to be able to tell squid to drop its current copy,
> > programmatically, when we update our backend database for a URL.
>
> Not very easily in 2.5, but you should be able to use a modified
> version of the purge tool I suppose. The request variance is stored
> in the object meta header on disk in the STORE_META_VARY_HEADERS TLV
> (8) using the syntax Header=Value, Header=Value, ... or just Header,
> ... if the header was mentioned in Vary but not part of the
> request.
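To illustrate the on-disk variance format Henrik describes, here is a minimal Python sketch of a parser for that Header=Value list. The function name and the dict representation are our own invention, not anything squid itself provides; a modified purge tool would still have to walk the object meta header to extract the STORE_META_VARY_HEADERS TLV value first.

```python
def parse_vary_metadata(vary):
    """Parse a STORE_META_VARY_HEADERS value of the form
    'Header=Value, Header=Value, ...', where a bare 'Header'
    entry means the header was named in Vary but was not
    present in the request."""
    variance = {}
    for item in vary.split(","):
        item = item.strip()
        if not item:
            continue
        header, sep, value = item.partition("=")
        # No '=' means the request did not send this header at all.
        variance[header] = value if sep else None
    return variance
```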
We're debating whether it's worth throwing time and energy at this
solution, or trying for some other kind of workaround. It seems like a
sane thing to want to do, though, no? "This object that you cached has
changed, please forget about all instances of it."
I suppose we're sort of approaching the problem backwards, having the
content provider pro-actively tell the cache about changes, but that is
basically what our application demands. We're currently programmatically
forcing squid to refresh content when it changes, but when we switch to
a situation where Vary comes into the picture, that whole scheme breaks
badly.
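For reference, the kind of programmatic refresh we do today can be driven with squid's PURGE method, which has to be enabled in squid.conf (e.g. "acl PURGE method PURGE" plus an http_access rule). A rough sketch, with hypothetical host/port values; note that once Vary is in play, a single PURGE only removes one variant, which is exactly why the scheme breaks:

```python
import http.client

def purge_url(cache_host, cache_port, url):
    """Ask squid to drop its cached copy of `url` via the PURGE
    method. Returns the HTTP status: 200 if an object was purged,
    404 if nothing was cached for that URL."""
    conn = http.client.HTTPConnection(cache_host, cache_port, timeout=5)
    try:
        conn.request("PURGE", url)
        return conn.getresponse().status
    finally:
        conn.close()
```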
> If the responses have a ETag and you use the ETag patch for 2.5 then
> Squid keeps an index of what variants it has for a given URL and
> their request variance. This however requires that you send different
> ETags for different encodings and, for best function, that your server
> also supports If-None-Match (last time this was discussed, mod_gzip
> supported neither, making a mess of things by claiming the identity
> and gzipped entity are the same entity).
We are also toying with the idea of adding proper ETag and If-None-Match
support to mod_gzip, ourselves, at this point. I took a look at the
ETag squid patch, and it looks interesting in principle. It doesn't
apply cleanly to 2.5.STABLE2, but I think I managed to massage the
rejects into the correct places, so we'll be taking a look at that.
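If we do end up adding this to mod_gzip ourselves, the core logic is small. Here's a hedged sketch of the two pieces: deriving a distinct ETag per Content-Encoding (the ";gzip" suffix scheme here is just one possible convention, not anything mod_gzip or the squid patch prescribes) and honoring If-None-Match:

```python
def etag_for(base_etag, content_encoding):
    """Derive a distinct ETag per encoding, so the identity and
    gzipped variants are never claimed to be the same entity.
    Expects a quoted ETag like '"abc"'; the suffix scheme is our
    own invention."""
    if content_encoding:
        return base_etag[:-1] + ";" + content_encoding + '"'
    return base_etag

def respond(request_headers, etag):
    """Return 304 when the client's If-None-Match lists our ETag,
    otherwise 200 (a sketch; a real module serves the body too)."""
    inm = request_headers.get("If-None-Match")
    if inm and etag in [t.strip() for t in inm.split(",")]:
        return 304
    return 200
```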
This is the part where I wonder out loud if we are the only group in
creation that's trying to do this gzipping and caching in a reverse
proxy situation, or if it's just our need for long cache times and
conditional purges that is unique.
Thanks for the input.
-- R Pickett <emerson@craigslist.org> craigslist.org

Received on Thu May 08 2003 - 18:30:30 MDT