I've got it. I set the option "forwarded_for" from off to delete, and now both websites get displayed through Squid.
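For reference, the relevant squid.conf change looks roughly like this (a minimal sketch; the comments summarise the documented behaviour of the Squid 3.x forwarded_for directive):

  # before: with "off" Squid appends "X-Forwarded-For: unknown" instead of the client IP
  #forwarded_for off

  # after: with "delete" Squid strips the X-Forwarded-For header from the request entirely
  forwarded_for delete

  # apply the change without a full restart:
  #   squid -k reconfigure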
Kind regards
Marc
-----Original Message-----
From: Amos Jeffries [mailto:squid3_at_treenet.co.nz]
Sent: Tuesday, 26 November 2013 13:11
To: squid-users_at_squid-cache.org
Subject: Re: [squid-users] ##palin AW: [squid-users] #Can't access certain webpages
On 27/11/2013 1:00 a.m., Grooz, Marc (regio iT) wrote:
> In my first case:
>
> Squid request:
>
> -MGET /cgi-bin/upload_status.cgi?uid=060950223627&files=:iso-27001-router-security-audit-checklist.xls&ok=1 HTTP/1.1
> Accept: text/html, application/xhtml+xml, */*
>
> Webserver answer:
> [-MHTTP/1.1 200 OK
> Date: Mon, 25 Nov 2013 12:48:57 GMT
>> Squid sends the first request again and again.
>
> Direct request without squid:
>
> Gm/GET /cgi-bin/upload_status.cgi?uid=318568766743&files=:aukirche.JPG&ok=1 HTTP/1.1
>
> Webserver answer:
> GmHTTP/1.1 200 OK
>
>> The website gets displayed.
>
Are those "-M" / "Gm/" characters really in front of the GET method name and the HTTP/1.1 response version label?
It looks like you may be receiving SOCKS protocol traffic.
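For illustration only (this byte layout comes from the SOCKS4 specification, not from your capture): a SOCKS CONNECT starts with binary bytes before any ASCII text, so stray characters ahead of the "GET" token can mean the HTTP request is wrapped inside another protocol:

  0x04 0x01 <dstport: 2 bytes> <dstip: 4 bytes> <userid> 0x00   <- SOCKS4 CONNECT header
  GET /cgi-bin/... HTTP/1.1                                      <- plain HTTP request line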
Amos