> Henrik Nordstrom wrote:
> On tis, 2007-11-13 at 20:39 -0700, murrah boswell wrote:
>
>> I have not fired up my scripts to trigger wget yet. I have been testing
>> by grabbing a few Web pages using a browser and logged into Squid
>> environment as user 'wget.' I am baby stepping my way through this, so I
>> want to get the Squid problem settled first.
>
> Please test using squidclient. It supports logging in and all, with no
> hidden agenda in the headers it adds.
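(For the archive: a minimal squidclient test with proxy authentication looks
something like the line below. The proxy host, port, and the user/password
values here are placeholders for your own setup.)

    squidclient -h 127.0.0.1 -p 3128 -u wget -w secret http://www.example.com/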
Boy do I feel stupid! Squid was working properly all along, and my problem was a total case of operator error.
I was testing sites that are generated out of databases by PHP scripts, specifically ones developed with PHPWebsite. What I was
"seeing" in the cache when I grepped it to check whether a website was stored there were references to specific items like
.css and .jpg files. None of the actual code for any of the pages was in the cache. So when I tried to go to the websites
as an unprivileged user, I was getting TCP_MISS entries and "relay denied" errors. However, if I entered the full URL for one of the .css
or .jpg files that was in the cache, it was served from the cache.
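In hindsight, a more reliable check than grepping the cache directory would have been to watch access.log for hits versus misses, for example (assuming the default native log format and location):

    grep 'example.com' /var/log/squid/access.log | awk '{print $4, $7}'

That prints the result code (TCP_HIT, TCP_MISS, ...) next to each requested URL, which would have shown right away that only the .css and .jpg objects were being cached.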
The point being, I was testing for something that was never actually stored in the cache. I would like to thank everyone for all
their help with my perceived problem, and to apologize for any inconvenience my mistake caused.
What I did realize, and other people pointed this potential problem out early in my threads, is that dynamic websites will cause
many cache misses for the configuration I am trying to implement. I now realize that I will probably not be able to implement my plan,
since most sites these days serve content from scripts and databases that cannot be cached.
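For anyone who hits the same wall: if I am reading the stock squid.conf of this vintage correctly, dynamic URLs are refused caching by default with lines like these (the ACL name may vary between versions):

    acl QUERY urlpath_regex cgi-bin \?
    cache deny QUERY

And even without a query string, pages generated by PHP typically come back without Last-Modified or Expires headers, so Squid treats them as uncacheable anyway.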
Thanks again for everyone's help,
Murrah Boswell