All,
I have configured squid to run in accelerator mode in front of our application server.
i.e.:
User Browser <--------> Squid <--------> Application Server
In our case, every URL uniquely defines a page, so we can cache them without fear that session state will cause different users to see different pages for the same URL. The URLs are generally of the form http://<host>/a_jsp_page?arguments. As our underlying data only changes daily, I would like each URL to be generated only once and have Squid serve cached copies to everyone else. To that end, I also restart Squid daily and clear its cache down.
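(For reference, a sketch of how such a daily reset might be scripted as a cron entry — the paths assume a default Red Hat Squid layout and are placeholders, not necessarily our exact job:)

```
# Hypothetical daily cache reset at 04:00: stop Squid, wipe the cache
# directory, rebuild the swap directories with -z, then restart.
0 4 * * * /etc/init.d/squid stop && rm -rf /var/spool/squid/* && /usr/sbin/squid -z && /etc/init.d/squid start
```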
I believe the following refresh_pattern should do the job:
refresh_pattern . 1440 100% 1440 ignore-reload override-lastmod override-expire reload-into-ims
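(For context, the relevant accelerator settings look roughly like the fragment below — the hostname and port are placeholders rather than our real values, and Squid 2.5's httpd_accel_* directives are assumed:)

```
# Sketch of a Squid 2.5 accelerator setup; app.example.com:8080 is a
# placeholder for the real application server.
httpd_accel_host app.example.com
httpd_accel_port 8080
httpd_accel_with_proxy off
httpd_accel_uses_host_header off

# Cache everything for 24 hours, overriding origin freshness headers.
refresh_pattern . 1440 100% 1440 ignore-reload override-lastmod override-expire reload-into-ims
```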
Unfortunately, this does not force all objects to be cached - I still get some TCP_MISS entries for identical requests. Some dynamic pages do get cached, but not even all the static images are (as I can see from the TCP_MISS lines in the log).
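(One thing I should perhaps double-check in our squid.conf: if I remember rightly, the stock Squid 2.5 config ships with an ACL that refuses to cache query-string URLs, which would explain misses on ?argument pages if it is still present:)

```
# Default lines in the stock Squid 2.5 squid.conf that prevent caching
# of cgi-bin and query-string URLs; they would need commenting out for
# ?argument pages to be cached.
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
```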
Is there a way to configure Squid so that I can see why a URL is a TCP_MISS (i.e. expired, or some other reason)? Is there something else I am missing that would force Squid to cache everything?
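(One approach I have considered, but not verified, is raising the debug verbosity for the refresh-decision code in squid.conf:)

```
# Unverified guess: debug section 22 covers Squid's refresh (freshness)
# calculation; level 3 should log why each object was judged stale.
debug_options ALL,1 22,3
```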
I am using Squid 2.5-STABLE on Red Hat 9.
Any pointers would be appreciated
Many thanks to you all
Tony
Received on Thu Nov 25 2004 - 01:48:29 MST