Dear Squid Experts,
I have a relatively slow CGI application that I'd like to accelerate by
caching its output. So far I've tried Apache's mod_disk_cache and Squid
(I'm using the Debian version of 2.5.7) and neither works, I believe
because they both consider the CGI output to be uncacheable. I'm hoping
that someone on this list can offer some advice about how or if Squid
can be made to work in this scenario.
I've added code to the CGI app that recognises and generates the
appropriate HTTP/1.1 headers, so caching should work, and it does work
correctly with Mozilla's browser cache. But I'd like a server-side cache
so that pages requested by different users do not need to be regenerated.
In response to a normal GET the application sends a 200 response
including the following headers:
Vary: Cookie
Cache-Control: must-revalidate
ETag: "hash-of-database-modification-time-and-any-cookie"
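Those headers could be emitted from a CGI script along these lines (a
minimal Python sketch, not the original app; the hash inputs and page
body are placeholders):

```python
#!/usr/bin/env python3
# Sketch of a CGI response carrying the three headers described above.
# make_etag() hashes the database modification time plus any cookie,
# matching the "hash-of-database-modification-time-and-any-cookie" idea.
import hashlib
import os

def make_etag(db_mtime, cookie):
    raw = "%s|%s" % (db_mtime, cookie)
    return '"%s"' % hashlib.md5(raw.encode("utf-8")).hexdigest()

cookie = os.environ.get("HTTP_COOKIE", "")          # empty if not logged in
etag = make_etag("db-mtime-placeholder", cookie)

print("Status: 200 OK")
print("Content-Type: text/html")
print("Vary: Cookie")
print("Cache-Control: must-revalidate")
print("ETag: %s" % etag)
print()                                             # end of headers
print("<html>...page body...</html>")
```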
I expect subsequent requests to include an If-None-Match header, and I
will then send either a 200 response with a new ETag if the page has
changed, or a 304 response with the same ETag if it has not.
(Generating the 304 response uses far fewer resources than regenerating
the entire page; I do need end-to-end revalidation.) This works with
Mozilla's browser cache.
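The revalidation step described above might look like this (again a
Python sketch under my own naming, not the original app):

```python
#!/usr/bin/env python3
# Sketch of the conditional-GET logic: if the client's If-None-Match
# matches the current ETag, answer with a cheap 304 and no body;
# otherwise regenerate the page and send a full 200.

def respond(current_etag, environ):
    inm = environ.get("HTTP_IF_NONE_MATCH")
    if inm is not None:
        offered = [tag.strip() for tag in inm.split(",")]
        if current_etag in offered:
            # Page unchanged: same ETag, no expensive regeneration.
            return "Status: 304 Not Modified\nETag: %s\n\n" % current_etag
    # Page changed, or an unconditional request: regenerate (the slow part).
    body = "<html>...regenerated page...</html>"
    headers = ("Status: 200 OK\n"
               "ETag: %s\n"
               "Vary: Cookie\n"
               "Cache-Control: must-revalidate\n\n") % current_etag
    return headers + body
```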
The "Vary: Cookie" header is required because users can log in to the
site using a cookie, and this slightly changes the returned HTML.
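For what it's worth, my understanding of what "Vary: Cookie" asks a
shared cache to do is to store one variant per distinct Cookie value for
the same URL, roughly like this (illustrative sketch only, not Squid's
actual implementation):

```python
# A Vary-aware cache keys each stored response on the URL plus the
# request's value of every header named in Vary (here, just Cookie).
cache = {}

def cache_key(url, request_headers, vary="Cookie"):
    return (url, request_headers.get(vary, ""))

# Logged-in and anonymous users get separate cached variants:
cache[cache_key("/app.cgi", {"Cookie": "user=alice"})] = "<html>alice's page</html>"
cache[cache_key("/app.cgi", {})] = "<html>anonymous page</html>"
```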
Should it be possible to support this using Squid? The FAQ suggests
that the Vary: header is not supported at all, but I have the impression
from elsewhere that this is out of date. Similarly, is it still true
that Squid does not support HTTP/1.1, or that it does not support enough
of HTTP/1.1 to do this?
I have removed the default

    acl QUERY urlpath_regex cgi-bin \?
    no_cache deny QUERY

lines from the configuration file.
Any advice at all would be much appreciated.
With many thanks in advance,
Phil Endecott.
Received on Tue Nov 16 2004 - 07:50:22 MST
This archive was generated by hypermail pre-2.1.9 : Wed Dec 01 2004 - 12:00:01 MST