Hello,
I've set up my first reverse proxy to accelerate a site. I've used wget
to spider the entire site several times and noticed that even after
running it, some files never get cached, like HTML files! I presume it
is because the HTML pages don't have the correct cache headers.
It didn't even want to cache .swf files, but then I added this line
and it helped a lot, though not completely:
refresh_pattern . 0 20% 4320 ignore-reload
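From reading the refresh_pattern docs there seem to be more override
options I could stack onto that line; this is roughly what I'm
considering trying next (untested, and the exact option names may
vary between Squid versions):

# force-cache sketch: treat everything as fresh for up to 3 days and
# ignore client reloads and missing/old validators
refresh_pattern . 0 20% 4320 override-expire override-lastmod ignore-reload ignore-no-cache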
I'm thinking the best approach is to make Squid cache EVERYTHING,
and then manually give it specific exceptions for dynamic content
(like .php and some .html pages with embedded PHP scripts). I don't
want to start editing the site's files because I want to test the
performance increase before adding headers one by one.
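To make the exceptions part concrete, this is roughly what I'm
picturing in squid.conf -- just a sketch based on the docs; the acl
names and the /admin/ path are made up and I haven't verified the
exact directives:

# don't cache PHP, or the handful of HTML pages with embedded PHP
acl dynamic_php urlpath_regex -i \.php$
acl dynamic_html urlpath_regex -i ^/admin/.*\.html$
cache deny dynamic_php
cache deny dynamic_html
# cache everything else aggressively
refresh_pattern . 0 20% 4320 ignore-reload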
Unless anybody suggests something better, could someone advise me how
to force it to cache everything, with an example of how to make exceptions?
I've been looking at the FAQ without much help...
Thanks!
Andres