Hi,
I'm having problems with Squid at the moment involving duplicate
ETags in request headers. I'm using Squid as an accelerator, forwarding
traffic to Apache, which serves up a Drupal installation.
After roughly 3 days, a number of pages on the site start to fail with
400 Bad Request errors; it starts with just a few and then slowly
spreads to more pages. I did a tcpdump of the requests going from Squid
to Apache, and Apache is returning the 400 because of the request
header size: hundreds of duplicate ETags pile up in the If-None-Match
header until it exceeds Apache's header field size limit.
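For reference, the capture was something along these lines (the
interface and backend port will vary; here I'm assuming Apache is
listening on port 8080 on the same box):

    # print full packets between Squid and the Apache backend in ASCII
    # (assumes Apache listens on port 8080 on localhost)
    tcpdump -i lo -A -s 0 'tcp port 8080'

The affected requests show a single If-None-Match line containing
hundreds of ETag entries, well past Apache's default field limit of
8190 bytes (the LimitRequestFieldSize directive).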
only way I've found to 'fix' this so far is to do one of the following
(example commands after the list):
1. Flush Squid cache entirely
2. Purge the affected pages
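By 'purge' and 'flush' I mean roughly the following (the URL is just an
example, purging needs a PURGE ACL in squid.conf first, and I'm assuming
the default CentOS cache_dir of /var/spool/squid):

    # squid.conf - allow PURGE requests from localhost
    acl purge method PURGE
    http_access allow purge localhost
    http_access deny purge

    # purge one affected page (example URL)
    squidclient -m PURGE http://www.example.com/node/123

    # flush the cache entirely: stop Squid, wipe the cache_dir,
    # rebuild the swap directories, and restart
    service squid stop
    rm -rf /var/spool/squid/*
    squid -z
    service squid start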
But after a few days the problem comes back. I've been using Squid as
an accelerator in front of Drupal installations for years and this
hasn't happened before. I'm running the following version of Squid:
Squid Cache: Version 2.6.STABLE21
which is the latest version available in the CentOS 5 repositories. The
only difference between this Squid/Apache/Drupal installation and
others that have worked fine in the past is the Drupal version: this
one runs Drupal 7. Supposedly Drupal 7 significantly changed its cache
handling, but I can't work out why that would cause this problem with
Squid.
The only thing I can think of at the moment is something to do with
Squid's cache replacement (specifically the LRU eviction policy), such
that when Squid evicts or rewrites cached objects, something ends up
corrupted or malformed.
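For what it's worth, the eviction policy is configurable in squid.conf;
in 2.6 the default is plain LRU, which is what I'm referring to above
(the heap-based alternatives need Squid built with
--enable-removal-policies):

    # squid.conf - object replacement (eviction) policy; lru is the default
    cache_replacement_policy lru
    # alternatives: heap GDSF, heap LFUDA, heap LRU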
Any help or suggestions would be much appreciated!
Thanks,
Andy Taylor