Hello Amos,
I checked the documentation; the ACL works fine when I match only the domain:
acl forbiddenURLs url_regex -i "/etc/squid/forbiddenURL.txt"
http_access deny forbiddenURLs
Any domain name I put in forbiddenURL.txt works fine; for example,
.example.com blocks everything for that domain.

However, when I change the expression to ".example.com/home.php", the
ACL stops matching once the request is redirected to the HTTPS page
(home.php) after that domain's login page.
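As a side note, url_regex entries are regular expressions, so an unescaped "." matches any character, not just a literal dot. A sketch of what the forbiddenURL.txt entry might look like with the dots escaped (assuming one pattern per line, which is how Squid reads the file):

```
\.example\.com/home\.php
```

Even with the pattern fixed, it may be worth checking what Squid actually sees for the HTTPS request, since a CONNECT tunnel only exposes the host and port to the proxy, not the page path.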
Is there any way I can validate my ACL statements against a URL?
(I want to know whether I am doing it correctly.)
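One way to sanity-check the patterns offline is a small helper script; this is not a Squid feature, just a hypothetical tester using Python's re module, which is close enough to Squid's extended-regex syntax for simple patterns like these:

```python
import re

def load_patterns(lines):
    """Compile each non-empty line as a case-insensitive regex,
    mirroring 'acl ... url_regex -i'."""
    return [re.compile(p.strip(), re.IGNORECASE) for p in lines if p.strip()]

def is_blocked(url, patterns):
    """Return True if any pattern matches anywhere in the URL,
    which is how url_regex matches."""
    return any(p.search(url) for p in patterns)

# Hypothetical contents of forbiddenURL.txt
patterns = load_patterns([
    r"\.example\.com/home\.php",
])

print(is_blocked("http://www.example.com/home.php", patterns))    # True
print(is_blocked("http://www.example.com/index.html", patterns))  # False
```

Feeding this script the exact lines from forbiddenURL.txt and the URLs you expect to block (and allow) shows quickly whether the expressions themselves are the problem.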
Regards
Supratik
On Thu, Jul 14, 2011 at 6:07 PM, Amos Jeffries <squid3_at_treenet.co.nz> wrote:
> On 15/07/11 00:17, Supratik Goswami wrote:
>>
>> Is there a way to create an ACL so that I can block only a few
>> pages of a domain?
>>
>> Example: for the domain example.com, I want to allow all pages
>> except those whose URL matches the following:
>>
>> example.com/home.php
>> example.com/home.php#!/profile.php
>>
>>
>> Regards
>>
>> Supratik
>
> Introducing the access control lists (ACL) documentation:
> http://wiki.squid-cache.org/SquidFaq/SquidAcl
> http://www.squid-cache.org/Doc/config/acl/
> http://www.squid-cache.org/Doc/config/http_access/
>
> NOTE: the #... part is called a page fragment and is completely
> internal to the web browser. The second example URL is therefore never
> seen by Squid.
>
>
> Amos
> --
> Please be using
> Current Stable Squid 2.7.STABLE9 or 3.1.14
> Beta testers wanted for 3.2.0.9
>
Received on Thu Jul 14 2011 - 13:26:39 MDT