On 1/02/2014 4:23 a.m., MARTIN, PETER CCS wrote:
> Hi,
>
> I'm not too sure what the terminology for this is! I'm wondering if
> Squid can function in a mode where it acts like a web server, and you
> pass it URL's as a querystring, and it connect you out to the site?
It can. It's called URL re-writing and is generally a very bad idea on
the current web, since the client and server each think they are
presenting resources for a different domain. If any part of the HTTP
transaction headers, HTML content, scripts, or DOM tree of the client
is generated with the server's view of the domain, things break
immediately.
>
> My issue is that we have a corporate proxy, and I need to bypass for
> some sites (moving list, don't want to have to keep it updated).
Laziness is no excuse to break the Internet for the proxy users.
If you want users to bypass the proxy controls this easily, simply let
them past in the first place.
OR, provide them with the tools necessary to keep the list updated
instead of doing it yourself. You could keep the list in a DB, use an
external_acl_type helper to check it in real-time, provide a
user-visible form to add entries to the DB, and put an "unlist" button
on the block error page.
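A minimal sketch of such a helper, assuming the list lives in a flat file of domains (standing in for the DB) and that squid.conf passes the destination host with a line like "external_acl_type bypass ttl=60 %DST /usr/local/bin/bypass_check.py" -- the file name, path, and lookup logic here are all illustrative, not a tested deployment:

```python
#!/usr/bin/env python3
"""Sketch of a Squid external_acl_type helper.

Squid writes one request per line on stdin (here just the destination
domain, per an assumed %DST format) and expects an "OK" or "ERR" reply
per line on stdout.
"""
import sys

def load_bypass_list(path):
    """Load one domain per line; hypothetical flat file instead of a DB."""
    try:
        with open(path) as f:
            return {line.strip().lower() for line in f if line.strip()}
    except FileNotFoundError:
        return set()

def check(domain, bypass_list):
    """Return "OK" if the domain or any parent domain is listed, else "ERR"."""
    parts = domain.strip().lower().split(".")
    # Walk up the labels so "example.com" also matches "www.example.com".
    for i in range(len(parts) - 1):
        if ".".join(parts[i:]) in bypass_list:
            return "OK"
    return "ERR"

if __name__ == "__main__":
    allowed = load_bypass_list("/etc/squid/bypass_domains.txt")
    for line in sys.stdin:
        # flush=True: Squid reads replies line-by-line, so no buffering.
        print(check(line, allowed), flush=True)
```

The web form then only needs to append or remove lines in that file (or rows in the DB); the helper picks changes up on the next lookup, subject to the ttl= cache in squid.conf.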
> The
> remote sites firewalls require us to come in from a specific source
> address, and so I thought if I could setup Squid and make the source
> URL for the sites in the browser have a prefix of it may work?
It would work far better to configure the browser to use a proxy which
has a static IP and simply relays traffic to the main proxy. That is
called proxy chaining.
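In squid.conf terms the relay proxy would be set up roughly like this -- hostname, port, and addresses are placeholders, and whether you pin the outgoing source address depends on whether the remote firewalls see this relay or the parent:

```
# squid.conf on the relay box (the one with the static IP).
# "mainproxy.example.local" on port 3128 is an assumed parent.
cache_peer mainproxy.example.local parent 3128 0 no-query default

# Send everything via the parent rather than going direct:
never_direct allow all

# If this box makes any direct connections, pin the source address
# the remote firewalls expect (placeholder address):
tcp_outgoing_address 192.0.2.10
```

The browsers then point at this relay instead of the main proxy, and no per-site URL trickery is needed.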
Amos
Received on Sat Feb 01 2014 - 00:48:23 MST