Hey Eliezer,
rblcheck.rb said:
You are listed on the following 3 blacklists
cbl.abuseat.org
dnsbl-1.uceprotect.net
dnsbl.dronebl.org
As I understand it, this could be the root of my problem. :)
Dmitry Shilenko wrote on 03.07.2014 21:45:
> Sorry, I accidentally made a mistake. :)
> 
> I just can't understand it: I *use* only an HTTP proxy, while Google search
> works over the HTTPS protocol. What is the connection between the two? :)
> 
> Eliezer Croitoru wrote on 03.07.2014 21:15:
>> Hey Dmitry,
>> 
>> Sometimes it happens because the headers carry a sign of a proxy in the
>> middle, or because Google thinks there is too much traffic coming from
>> your network, which is not typical.
>> I do not know how Google can identify a network behind a proxy, but it
>> seems to me that adjusting these headers is the right direction.
>> 
>> You can use the "via off" option and also remove the X-Forwarded-For
>> header on the proxy using:
>> http://www.squid-cache.org/Doc/config/forwarded_for/
>> with "delete".
>> 
>> It is a basic "try and find out" step.
>> Also fill in this form:
>> https://support.google.com/websearch/contact/ban
>> 
>> One thing that I think you should check is the PTR record for your IP,
>> and also run RBL checks on your IP.
>> You can use the tool I have modified at:
>> http://www1.ngtech.co.il/rbl/rblcheck.rb
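>> 
>> If the script is not handy, a minimal Ruby sketch of the same kind of
>> check looks like this (this is not the actual rblcheck.rb code, and the
>> zone list is only an example taken from your output):
>> 
>>   require 'resolv'
>> 
>>   ip = ARGV[0] or abort "usage: rblcheck.rb <ip>"
>> 
>>   # PTR check: a missing or generic reverse record can look suspicious.
>>   begin
>>     puts "PTR: #{Resolv.getname(ip)}"
>>   rescue Resolv::ResolvError
>>     puts "PTR: none found"
>>   end
>> 
>>   # DNSBL check: reverse the octets and query each blacklist zone.
>>   # An A-record answer means the IP is listed.
>>   reversed = ip.split('.').reverse.join('.')
>>   %w[cbl.abuseat.org dnsbl-1.uceprotect.net dnsbl.dronebl.org].each do |zone|
>>     begin
>>       Resolv.getaddress("#{reversed}.#{zone}")
>>       puts "LISTED on #{zone}"
>>     rescue Resolv::ResolvError
>>       puts "not listed on #{zone}"
>>     end
>>   end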
>> 
>> You can also try adding the Google Apps headers that restrict access to
>> Google services; they might block some traffic which should not be there
>> in the first place (as a test only, to see if it helps).
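>> 
>> For example (just a sketch under assumptions: the request_header_add
>> directive exists only in Squid 3.5 and later, not in your 3.3.8, and
>> "example.com" is a placeholder for your own domain):
>> 
>>   # Ask Google services to allow only accounts from the listed domains.
>>   request_header_add X-GoogApps-Allowed-Domains "example.com" all
>> 
>> On 3.3.8 the same header could be injected by an ICAP service or an
>> upstream helper instead.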
>> 
>> Eliezer
>> 
>> 
>> On 07/03/2014 03:43 PM, Dmitry Shilenko wrote:
>>> Hello. I am using Squid 3.3.8 as a transparent (intercepting) proxy on
>>> openSUSE 13.1. Here is the configuration:
>>> 
>>> visible_hostname koreamotors.com.ua
>>> acl localnet src 192.168.0.0/24 # RFC1918 possible internal network
>>> acl SSL_ports port 443
>>> acl Safe_ports port 80          # http
>>> acl Safe_ports port 21          # ftp
>>> acl Safe_ports port 443         # https
>>> acl Safe_ports port 70          # gopher
>>> acl Safe_ports port 210         # wais
>>> acl Safe_ports port 1025-65535  # unregistered ports
>>> acl Safe_ports port 280         # http-mgmt
>>> acl Safe_ports port 488         # gss-http
>>> acl Safe_ports port 591         # filemaker
>>> acl Safe_ports port 777         # multiling http
>>> acl CONNECT method CONNECT
>>> 
>>> # custom Global-it ACLs
>>> acl AdminsIP src "/etc/squid/AccessLists/AdminsIP.txt"
>>> acl RestrictedDomains dstdomain
>>> "/etc/squid/AccessLists/RestrictedDomains.txt"
>>> acl ad_group_rassh urlpath_regex -i
>>> "/etc/squid/AccessLists/rasshirenie.txt"
>>> 
>>> http_access allow localhost
>>> http_access deny !Safe_ports
>>> # Deny CONNECT to other than SSL ports
>>> http_access deny CONNECT !SSL_ports
>>> 
>>> # custom Global-it settings
>>> http_access allow AdminsIP
>>> http_access deny RestrictedDomains
>>> http_access deny ad_group_rassh
>>> http_access allow localnet
>>> http_access deny all
>>> 
>>> icp_access allow localnet
>>> icp_access deny all
>>> 
>>> http_port 192.168.0.97:3128
>>> http_port 192.168.0.97:3129 intercept
>>> 
>>> cache deny all
>>> 
>>> # Add any of your own refresh_pattern entries above these.
>>> #
>>> refresh_pattern ^ftp:           1440    20%     10080
>>> refresh_pattern ^gopher:        1440    0%      1440
>>> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
>>> refresh_pattern .               0       20%     4320
>>> As soon as Squid starts intercepting the traffic, Google immediately
>>> starts asking for a captcha. What should I do to solve this problem?
>>> 
>>> Dmitry
--
Best regards,
Dmitry Shilenko
Systems Engineer, global-it.com.ua
mob. (063)142-32-59, office 221-55-72