Hello,
I'm new to this list, but I have already searched the archive for the
following problem. My setup is as follows:
I have set up Squid as a proxy cache for our department, which is connected
over a slow line to the main firewall of our institute:
[local-1]
         \
[local-2] - [squid-cache] --(slow line)--> [firewall of institute]
         /
[local-3]
Now the situation has changed and we have to use a proxy instead of the
"firewall of institute". With my current (mostly standard) Squid setup I
just get 403 errors, even when I try to connect directly to the Internet
from my workstation. I'm using Squid 2.2.5 from Debian 2.2.
The list archive mentions the cache_peer option, which should be the way
to solve this problem, but I suspect I did not use the right options.
I tried
cache_peer wittenau.ivbb.bund.de parent 80 7 no-query
and
cache_peer wittenau.ivbb.bund.de parent 80 3130 no-query
but in both cases I got:
Error 500
Reason: Can't locate remote host.
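From what I can see in squid.conf.default, the cache_peer syntax is

cache_peer <hostname> <type> <http-port> <icp-port> [options]

so the third field is the port the parent proxy itself listens on and the
fourth its ICP port, which means port 80 was probably wrong in my attempts.
If the parent listens on 8080 (only a guess, I still have to confirm the
real port with the institute) and does not speak ICP, I think something
like this would be closer:

# assumed parent HTTP port 8080 -- has to be confirmed with the institute
# icp-port 0 together with no-query: never send ICP to this parent
# default: use this parent as the route of last resort
cache_peer wittenau.ivbb.bund.de parent 8080 0 no-query default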
I really don't know anything about the proxy that we have to use, and I
hope that no-query was the right option to tell Squid not to send ICP
queries to it. I don't fully understand the ICP concept, but I'm afraid
the upstream proxy isn't as clever as Squid might be.
991391520.850 185 10.15.140.59 TCP_MISS/403 289 GET http://tower.physik.uni-halle.de/ - DIRECT/tower text/html
991395122.707 2423 10.15.140.59 TCP_MISS/500 499 GET http://tower.physik.uni-halle.de/ - FIRST_UP_PARENT/wittenau.ivbb.bund.de text/html
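If I read the two access.log lines above correctly, the direct attempt is
rejected with a 403 (DIRECT), and the attempt via the parent is forwarded
(FIRST_UP_PARENT) but comes back as a 500, so I cannot tell whether the
"Can't locate remote host" message is produced by my Squid or by the
parent proxy. Since direct access is apparently blocked completely, I
suppose I should at least tell Squid never to go direct and always to use
the parent. If I understand the documentation correctly, that would be
roughly this (the "all" ACL is the one from the default squid.conf; only
a sketch of what I think is needed):

# "all" is the stock ACL from the default squid.conf
acl all src 0.0.0.0/0.0.0.0
# never contact origin servers directly, always forward to the parent
never_direct allow all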
Any idea how I can chain the proxies correctly?
Kind regards
Andreas.
Received on Fri Jun 01 2001 - 06:57:50 MDT