Greetings
We have an unusual problem.
One of our clients uses a program to request map data from one of our 
servers. We have a Squid reverse-proxy cache in front of it: the client 
hits Squid, which fetches the data from the web server and serves it 
back to the client.
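In case it matters, the accelerator setup is nothing exotic; something 
along these lines (the hostname, IP, and port below are placeholders, 
not our real config):

    http_port 80 accel defaultsite=maps.example.com
    cache_peer 192.0.2.10 parent 8080 0 no-query originserver name=mapsrv
    acl mapsite dstdomain maps.example.com
    http_access allow mapsite
    cache_peer_access mapsrv allow mapsite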
Here's the problem. The client uses broken code that times out before 
the map data has finished processing, then resubmits the request. It'll 
do this about 10 times before finally dying. Each new request also 
times out, so nothing gets done and a lot of time is wasted (a single 
one of these requests might take 5 minutes to return).
So, I've proposed that we use url_rewrite_program to pass the request 
to a program which makes the request to the web server itself (and 
DOESN'T time out!). It then returns the same URL unchanged, but by that 
time the object is in the cache, so the original Squid process serves 
the data from the cache.
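As a rough sketch of what I have in mind (the helper path, the proxy 
address, and the X-Prefetch marker header are all made up for 
illustration), the helper would speak Squid's one-request-per-line 
stdin protocol, refetch the URL through Squid with no timeout so the 
finished object lands in the cache, and then hand the URL back 
unchanged:

    #!/usr/bin/env python3
    # Sketch of a url_rewrite_program helper (hypothetical names
    # throughout).  Squid sends one request per line on stdin:
    #   URL client_ip/fqdn ident method [...]
    import sys
    import urllib.request

    SQUID_PROXY = "http://127.0.0.1:3128"  # assumption: where Squid listens

    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": SQUID_PROXY}))

    for line in sys.stdin:
        fields = line.split()
        if not fields:
            print(flush=True)  # empty reply = leave the request untouched
            continue
        url = fields[0]
        try:
            req = urllib.request.Request(url)
            # Mark the prefetch so an ACL can keep it out of this helper;
            # otherwise our own fetch would recurse right back into us.
            req.add_header("X-Prefetch", "1")
            # No timeout: block until the map server finishes, however long.
            opener.open(req, timeout=None).read()
        except Exception:
            pass  # on error, fall through and let Squid carry on as usual
        # Same URL back; by now the object should already be in the cache.
        print(url, flush=True)

It would be wired in with something like:

    url_rewrite_program /usr/local/bin/prefetch.py
    url_rewrite_children 10
    acl prefetch req_header X-Prefetch ^1$
    url_rewrite_access deny prefetch
    url_rewrite_access allow all

where the req_header ACL stops the helper's own prefetch request from 
re-entering the helper.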
Is this craziness? Has anyone done anything like this before? Or is 
there some better, easier way to handle this?
Thanks
JR