Hi squid-users,
I'm an experienced web developer using Squid for the first time. For
internal testing, we need a stable cache of a certain list of sites
(which we do not own) that our tests exercise. I know Squid isn't
built for this, but I assumed it could surely be configured to cache
literally every HTTP response and then serve all subsequent requests
from that cache. Here is my fairly simple Squid 3.1 config, which is
intended to do exactly that:
===================================================
offline_mode on
refresh_pattern . 525600 100% 525600 override-expire override-lastmod ignore-reload ignore-no-cache ignore-no-store ignore-must-revalidate ignore-private ignore-auth
vary_ignore_expire on
minimum_expiry_time 99 years
minimum_object_size 0 bytes
maximum_object_size 1024 MB
cache_effective_user _myusername
http_access allow all
coredump_dir /usr/local/squid/var/logs
strip_query_terms off
url_rewrite_access allow all
url_rewrite_program /usr/local/squid/similar_url_rewrite.rb
url_rewrite_concurrency 10
url_rewrite_children 10
cache_dir ufs /usr/local/squid/caches/gm 5000 16 256
http_port 8082
pid_filename /usr/local/squid/var/run/gm.pid
access_log /usr/local/squid/var/logs/access-gm.log
cache_log /usr/local/squid/var/logs/cache-gm.log
===================================================
As you can see, I am rewriting URLs so that each request maps onto a
URL I know should already be in the cache, because I have hit it
before. Even so, my hit rate is only about 56%, and most of those
hits are 304 IMS revalidations rather than plain hits. I have been
unable to find documentation or debug logging detailed enough to
explain why Squid still declines to cache some responses.
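For anyone unfamiliar, here is a minimal sketch of the helper
protocol my rewriter relies on. It is not my real matching logic
(the query-stripping rewrite below is just a placeholder); it only
shows the I/O contract for a concurrent Squid 3.1 rewriter:
===================================================
#!/usr/bin/env ruby
# Minimal sketch of a concurrent Squid url_rewrite helper.
STDOUT.sync = true   # Squid expects unbuffered replies

STDIN.each_line do |line|
  # With url_rewrite_concurrency > 0, each request arrives as:
  #   <channel-ID> <URL> <client_ip/fqdn> <user> <method> ...
  id, url = line.split
  next if url.nil?
  # Placeholder rewrite: strip the query string so cache-busting
  # parameters collapse onto a URL already stored in the cache.
  rewritten = url.sub(/\?.*\z/, "")
  # Reply "<channel-ID> <rewritten-URL>"; the ID alone means
  # "no change".
  puts "#{id} #{rewritten}"
end
===================================================
With url_rewrite_concurrency set, Squid prefixes each request with a
channel ID and expects it echoed back in the reply, which is what the
sketch does.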
I would very much like to use Squid if possible, since it is a
stable, well-known, production-grade server. Any tips? Is this
simply not doable with Squid?
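If more verbose debug logging is the answer, which cache.log
sections should I raise? My guess from debug-sections.txt would be
the storage-manager and refresh-calculation sections, something
like:
===================================================
debug_options ALL,1 20,3 22,3
===================================================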
Thanks in advance!