Hey, thanks for your reply.
On Tue, May 22, 2012 at 10:40 AM, Eliezer Croitoru <eliezer_at_ngtech.co.il> wrote:
> On 20/05/2012 19:47, Jason Voorhees wrote:
>>
>> Hi people:
>>
>> I've been a Squid user for a long time, but my skills -I believe-
>> aren't good enough to implement some of the features I'm asking
>> about in this e-mail.
>>
>> At a university there are 6000-8000 users (spread across a big
>> campus over different VLANs, offices and even metro-ethernet
>> connected branches) browsing the Internet through two lines of 80
>> and 70 Mbps. Currently a Fortinet appliance does the web filtering,
>> with some interesting features I'd like to implement with Squid
>> too. These are the pros and cons of the Fortinet:
>>
>> cons
>> ====
>> - It doesn't have a cache (at least not an effective one).
>> - When the Fortinet enforces too many bandwidth rules (something
>> like Squid delay pools) it starts to work slowly and browsing
>> becomes slow too.
>>
> Squid can implement both of them, but it depends on the hardware
> that is hosting Squid.
> A basic 4-core machine with 8 GB of RAM can do the job for you.
> The user count alone isn't much of a sizing metric; what matters is
> requests per second and bandwidth throughput together.
>
>
I'll keep your hardware recommendations in mind, but now that you
mention 4 cores for Squid I remember an old doubt: can Squid really
benefit from more CPU cores? A long time ago I asked about Squid
running on an SMP kernel in a server with two processors (sockets),
but someone told me that Squid wasn't prepared to use more than one
processor, so I wouldn't notice any difference between using 1 or 2
processors. Does this apply to cores too?
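By the way, while searching I saw that Squid 3.2 is supposed to add
SMP support through a new "workers" directive, while 3.1 and older
run as a single process. If I understood the docs, on a 4-core box it
would look like this (just an untested sketch on my side):

    # squid.conf (Squid 3.2+ only, untested)
    # start 4 kid processes, roughly one per CPU core
    workers 4

Is that the right way to use the extra cores?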
>
>> pros
>> ====
>> - It has a feature to transparently block HTTPS websites. The
>> Fortinet admin told me that only for blocked webpages do users get
>> a warning about an incorrect certificate (a Fortinet digital
>> certificate), while for allowed websites users don't get any
>> warning about failing digital certificates (I don't know if this is
>> true or even possible).
>> - Its web filtering is good; it has an up-to-date database of
>> categorized websites that makes blocking easy.
>>
>> What I plan to do is (or what I'd like to do):
>>
>> - Put Squid in front of the Fortinet so it can use Squid's cache. I
>> read this is possible using WCCP and some other things.
>> - Squid should work as a replacement for the Fortinet if it someday
>> fails, so Squid is the backup solution.
>
> It depends on the outgoing IP address and on the interception level.
> In basic interception mode you can use the Fortinet as a cache_peer.
>
>
I really don't know if WCCP is necessary for this scenario; it's
something I just found in a tutorial on the Internet. So, just by
using a cache_peer configuration I could make the Fortinet use
Squid's cache?
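Just to check I understood you, do you mean something like this? (The
address and port are placeholders for our Fortinet box.)

    # squid.conf sketch: route client traffic through the Fortinet
    # as a parent cache (192.0.2.10:8080 is a made-up example)
    cache_peer 192.0.2.10 parent 8080 0 no-query default
    # never go direct, always use the parent
    never_direct allow all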
>
>>
>> So to achieve this I think I need to:
>>
>> a) Do good filtering: I was thinking about configuring Squid +
>> SquidGuard with a free database, but I have a simple and basic
>> question here: when I use a redirector like SquidGuard, will all
>> Squid ACLs definitely stop working? I mean, can I use a redirector
>> and still use my traditional ACLs (acl, http_access,
>> http_reply_access)? The last time I used a redirector with Squid it
>> seemed the ACLs weren't even read by Squid, hence this doubt.
>>
> A url_rewrite program is what you will use, and all the ACLs will
> keep working the same way.
> You can bypass the url_rewrite with ACLs... so to speak.
>
So if my URL redirector (SquidGuard) and the Squid ACLs work
together, which of them takes precedence over the other? Is there any
special setting needed to make both sets of rules (Squid's and the
redirector's) work? Do I need to put the url_rewrite directives
above/below the http_access directives, or something like that?
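To make my question more concrete, the setup I have in mind is
roughly this (the squidGuard path is a guess from my distro, and the
"exempt" ACL is a made-up example):

    # squid.conf sketch: SquidGuard hooked in as the rewrite helper
    url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
    url_rewrite_children 10
    # hypothetical ACL that skips the redirector for one subnet
    acl exempt src 10.0.5.0/24
    url_rewrite_access deny exempt

Would the http_access rules then still be evaluated before the
request is handed to squidGuard?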
>
>
>> b) Integrate the Fortinet with WCCP: I quickly skimmed a few
>> tutorials on how to do that, but... have you achieved this without
>> problems?
>
> What exactly do you want to achieve by using WCCP? What benefit do
> you expect from it?
>
I really don't know; I just read something about it on the Internet.
I'll investigate further before mentioning it again.
>
>
>>
>> c) Do transparent HTTPS proxying with Squid: I tried to use the
>> https_port + ssl-bump feature of Squid 3.1 and iptables (REDIRECT
>> of port 443 to 3128) without 100% success. I generated my own
>> certificate, and it's the one users get when trying to view some
>> websites (e.g. facebook.com), which is OK, but some websites didn't
>> work as expected: some loaded fine, some loaded without CSS
>> stylesheets or images, and some others never loaded (I got the
>> "redirect loop" error in the browser). I wasn't able to build Squid
>> 3.2, and I don't know if that version is necessary to get
>> transparent HTTPS proxying working.
>
> To use ssl-bump you use a different port than 3128, dedicated
> specifically to ssl-bump.
> There was a bug somewhere that caused a loop like that, and I think
> the cause is redirecting 443 to 3128 instead of to the ssl-bump
> port.
> Try it again and you will see miracles :]
>
>
Do you mean it was maybe caused by a Squid bug? Or do you mean I may
have pointed the iptables redirect incorrectly at the http_port
instead of the https_port?
I used Squid 3.1.19 on CentOS 6.2, but now I plan to use Squid 3.2
built from source on Debian 6.
I read that Squid 3.2 supports something called "on-the-fly
certificate creation", or something like that, which isn't included
in the Squid 3.1.* versions. Do you remember which feature this is?
Do I necessarily need it to implement transparent HTTPS proxying
correctly, without problems?
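In case it helps to pin down what I did wrong, what I plan to try on
the 3.2 build is roughly this; I copied the option names from the 3.2
documentation, so please treat it as an untested sketch (certificate
and helper paths are just examples):

    # squid.conf sketch for Squid 3.2 (untested)
    http_port 3128 intercept
    https_port 3129 intercept ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/proxy.pem
    ssl_bump allow all
    # helper that signs the generated certificates (build with --enable-ssl-crtd)
    sslcrtd_program /usr/local/squid/libexec/ssl_crtd -s /var/lib/ssl_db -M 4MB

    # iptables: HTTP to 3128, HTTPS to the dedicated ssl-bump port
    iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 3128
    iptables -t nat -A PREROUTING -p tcp --dport 443 -j REDIRECT --to-port 3129

Does that look closer to what you meant by a dedicated ssl-bump port?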
>
>>
>> d) Cache performance: Are there any special Squid settings that
>> would help me get the maximum performance out of my cache? Is Squid
>> the best open-source solution for building a powerful cache for my
>> users?
>>
>> I hope someone with some free time can help with suggestions and
>> ideas, or point me to some articles on the Internet about these
>> features.
>
> There are other open-source cache options, but Squid is the most
> advanced one that I have seen and used.
> It's very simple to configure compared to many other solutions that
> exist, even compared to paid ones.
> For dynamic content you can add an instance of Squid 2.7.STABLE9
> patched to also cache YouTube and some other sites that won't be
> cached due to the behavior of their dynamic links.
>
> If you need some more help, don't be afraid to ask.
>
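On the cache side (my question d), my plan was to start from
something simple like this and then measure; the sizes are just
guesses for a box like the 4-core / 8 GB one you described:

    # squid.conf sketch: basic cache sizing (values are guesses, not tuned)
    cache_mem 2048 MB
    maximum_object_size_in_memory 512 KB
    # ~100 GB disk cache (aufs) on a dedicated disk
    cache_dir aufs /var/spool/squid 100000 16 256
    maximum_object_size 200 MB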
I read some forum threads about a war between Squid and Varnish over
performance, with some benchmarks and some fanaticism :) The scenario
where, as I understood it, Varnish and Squid compete is when they are
put in front of a high-load web server, working as a reverse proxy.
However, my scenario is not to use Squid as a reverse proxy but to
put it in front of a Fortinet appliance to improve web browsing for
thousands of users.
Do Squid and Varnish compete in both scenarios, reverse and forward
proxy? Or is Varnish only usable in reverse-proxy scenarios?
I read a discussion about these two products working as reverse
proxies where someone claimed that Squid worked fine with good
results, but after some hours of continuous operation Squid started
to degrade in performance until the admin was forced to restart it to
solve the issue.
Is this performance issue with Squid real? If so, was it caused by an
already-fixed bug or by a bad Squid configuration?
Thanks
> good luck,
> Eliezer
>
>
>>
>> Thanks
>
>
>
> --
> Eliezer Croitoru
> https://www1.ngtech.co.il
> IT consulting for Nonprofit organizations
> eliezer <at> ngtech.co.il