On Mon, Mar 26, 2012 at 5:45 PM, Amos Jeffries <squid3_at_treenet.co.nz> wrote:
> On 27.03.2012 10:13, Carlos Manuel Trepeu Pupo wrote:
>>
>> On Sat, Mar 24, 2012 at 6:31 PM, Amos Jeffries <squid3_at_treenet.co.nz>
>> wrote:
>>>
>>> On 25/03/2012 7:23 a.m., Carlos Manuel Trepeu Pupo wrote:
>>>
>>>> On Thu, Mar 22, 2012 at 10:00 PM, Amos Jeffries wrote:
>>>>>
>>>>>
>>>>> On 23/03/2012 5:42 a.m., Carlos Manuel Trepeu Pupo wrote:
>>>>>>
>>>>>>
>>>>>> I need to limit each user to a single connection when downloading
>>>>>> files with specific extensions, but I don't know how to say that the
>>>>>> limit is one connection per file, rather than one connection in total
>>>>>> for all files with that extension.
>>>>>>
>>>>>> i.e.:
>>>>>> www.google.com                         # all the connections it needs
>>>>>> www.any.domain.com/my_file.rar         # just one connection to that file
>>>>>> www.other.domain.net/other_file.iso    # just one connection to this file
>>>>>> www.other_domain1.com/other_file1.rar  # just one connection to that file
>>>>>>
>>>>>> I hope you understand me and can help me, I have my boss hurrying me
>>>>>> !!!
>>>>>
>>>>>
>>>>>
>>>>> There is no easy way to test this in Squid.
>>>>>
>>>>> You need an external_acl_type helper which gets given the URI and
>>>>> decides whether it is permitted or not. That decision can be made by
>>>>> querying Squid cache manager for the list of active_requests and
>>>>> seeing if the URL appears more than once.
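(As a rough illustration of that check - a sketch only, with a placeholder URL and assuming squidclient can reach the proxy on 127.0.0.1 - the cache manager query could look something like:

squidclient -h 127.0.0.1 mgr:active_requests | grep -c "http://example.com/file.iso"

A count above 1 would mean another request for the same file is already being serviced.)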
>>>>
>>>>
>>>> Hello Amos, following your instructions I made this external_acl_type
>>>> helper:
>>>>
>>>> #!/bin/bash
>>>> # Count how many of Squid's active requests contain the URI given in $1.
>>>> result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
>>>> if [ $result -eq 0 ]
>>>> then
>>>>     echo 'OK'
>>>> else
>>>>     echo 'ERR'
>>>> fi
>>>>
>>>> # If the same URI already appears, I deny it. I ran a few tests and it
>>>> # worked for me. The problem is when I add the rule to Squid. I did this:
>>>>
>>>> acl extensions url_regex "/etc/squid3/extensions"
>>>> external_acl_type one_conn %URI /home/carlos/script
>>>> acl limit external one_conn
>>>>
>>>> # where /etc/squid3/extensions contains:
>>>>
>>>> \.(iso|avi|wav|mp3|mp4|mpeg|swf|flv|mpg|wma|ogg|wmv|asx|asf|deb|rpm|exe|zip|tar|tgz|rar|ppt|doc|tiff|pdf)$
>>>>
>>>> http_access deny extensions limit
>>>>
>>>>
>>>> So when I run squid3 -k reconfigure, Squid stops working.
>>>>
>>>> What could be happening ???
>>>
>>>
>>>
>>> * The helper needs to be running in a constant loop.
>>> You can find an example here:
>>>
>>> http://bazaar.launchpad.net/~squid/squid/3.2/view/head:/helpers/url_rewrite/fake/url_fake_rewrite.sh
>>>
>>> although that is a re-writer, and for an external ACL you do need to keep
>>> the OK/ERR responses.
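A minimal sketch of such a loop for an external ACL helper, reading one URL per line from STDIN and answering OK or ERR on STDOUT (the actual decision is left as a placeholder here):

#!/bin/bash
# Keep running until Squid closes our STDIN; one request per line.
while read url; do
    # ... decide about "$url" here ...
    echo 'OK'
done

Each input line must produce exactly one OK/ERR line in reply.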
>>
>>
>> Sorry, this is my first helper, so I do not understand what "running in a
>> constant loop" means; in the example I see something similar to what I do.
>> Making some tests I found that without this line:
>> result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$1"`
>> the helper does not crash. It does not work either, but it does not crash,
>> so I think that line is somehow the problem.
>
>
>
> Squid starts helpers, then uses the STDIN channel to pass each one a series
> of requests, reading the STDOUT channel for the results. Once started, the
> helper is expected to keep running until an EOF/close/terminate signal is
> received on its STDIN.
>
> Your helper is exiting after only one request, without being asked to by
> Squid. That is logged by Squid as a "crash".
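A quick way to see this protocol by hand, assuming the helper has been changed to read URLs from STDIN in a loop and is saved as /home/carlos/script: run it in a terminal, type a URL, and it should answer OK or ERR and then wait for the next line until the input is closed with Ctrl-D. A one-shot test might look like:

echo "http://example.com/file.iso" | /home/carlos/script

Here the echo closes STDIN after a single line, so the helper may exit afterwards; when Squid is feeding STDIN it stays open for the whole life of the helper.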
>
>
>>
>>>
>>> * "eq 0" - there should always be 1 request matching the URL: the very
>>> request you are testing. What you want to check is whether the count is
>>> greater than 1 or not; you want to deny for the case where there are *2*
>>> requests in existence.
>>
>>
>> This is true, but the way I saw it was: "if the URL does not exist, it
>> cannot be a duplicate". I don't think that is wrong !!
>
>
> It can't not exist. Squid is already servicing the request you are testing
> about.
>
> Like this:
>
> receive HTTP request -> (count=1)
> - test ACL (count=1 -> OK)
> - done (count=0)
>
> receive a HTTP request (count=1)
> - test ACL (count=1 -> OK)
> receive b HTTP request (count=2)
> - test ACL (count=2 -> ERR)
> - reject b (count=1)
> done a (count=0)
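In helper terms, the trace above suggests testing for "more than one match" rather than "no match at all", i.e. something along the lines of:

if [ "$result" -gt 1 ]; then echo 'ERR'; else echo 'OK'; fi

so a request is denied only when a second active request for the same URL already exists.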
With your explanation and code from Eliezer Croitoru I made this:
#!/bin/bash
while read line; do
    result=`squidclient -h 192.168.19.19 mgr:active_requests | grep -c "$line"`
    # -> Added this line to see in a file the $URI passed to the helper
    echo "$line" >> /home/carlos/guarda
    # -> With your great explanation I changed the test to "1"
    if [ $result -eq 1 ]
    then
        echo 'OK'
    else
        echo 'ERR'
    fi
done
It looks like it's going to work, but here are the remaining problems:

1- The echo "$line" >> /home/carlos/guarda line does not save anything to
the file.
2- When the helper returns 'OK' I can't write the rule in my .conf the way
I wrote it before; I have to use something like "http_access deny
extensions !limit". From all the help you have given me, I understand that
the name "limit" is not really accurate here: the rule denies on "!limit"
because when there is just one connection I must not block the page.
3- With the script exactly as Eliezer typed it, the page with the URL to
download stays loading infinitely.

So there is less work left; can you help me ??
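For reference, a sketch of the rule described in point 2, reusing the ACL names from the earlier configuration and assuming the helper answers OK when the URL is not already being downloaded:

external_acl_type one_conn %URI /home/carlos/script
acl limit external one_conn
http_access deny extensions !limit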
>
>
>>
>>>
>>> * ensure you have manager requests from localhost not going through the
>>> ACL test.
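A sketch of one way to arrange that, assuming the standard manager ACLs from the default squid.conf of this era, and assuming the helper queries the manager via 127.0.0.1 (or that the proxy's own listening address is covered by a similar allow rule); the extension rule then only applies to traffic that gets past the manager rules:

acl manager proto cache_object
acl localhost src 127.0.0.1/32
http_access allow manager localhost
http_access deny manager
# manager requests never reach the rule below
http_access deny extensions !limit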
>>
>>
>> I was doing this wrong - the localhost requests were going through the ACL,
>> but I have just changed that !!! The problem persists. What can I do ???
>
>
> which problem?
>
>
> Amos
Received on Tue Mar 27 2012 - 15:28:02 MDT