Alex Rousskov wrote:
> It is possible to avoid caching duplicate content, but that only allows
> you to handle cache hits more efficiently. It does not help with cache
> misses (when the URL requested by the client has not been seen before).
>
> If content publishers start publishing content checksums and browsers
> automatically add those checksums to requests, then you would have the
> Utopia you dream about :-). This will not happen while content
> publishers benefit from getting client requests more than they suffer
> from serving those requests.
I mean the content that Squid is already aware of, i.e. content that Squid
has accessed so far.
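
For that case, a StoreID helper is the usual tool: it rewrites URLs that are
known to carry the same bytes onto one canonical store key, so Squid serves
them from a single cache entry. Below is a minimal sketch, assuming the
standard store_id_program helper protocol (one "URL [extras]" request per
line on stdin, "OK store-id=..." or "ERR" on stdout, no helper concurrency);
the mirror pattern and the internal key prefix are made-up examples, not
something Squid ships with.

#!/usr/bin/env python3
# Hypothetical StoreID helper sketch: collapse URLs of known mirrors of the
# same content onto one canonical store key.
import re
import sys

# Illustrative rule only: treat any host serving /ubuntu/<path> as a mirror
# of the same repository and key the object by <path> alone.
MIRROR_RE = re.compile(r'^https?://[^/]+/ubuntu/(?P<path>.*)$')

def store_id(url):
    m = MIRROR_RE.match(url)
    if m:
        # Made-up internal prefix; any stable, unique string works as a key.
        return 'http://dedup.internal/ubuntu/' + m.group('path')
    return None

def main():
    for line in sys.stdin:
        parts = line.strip().split()
        if not parts:
            continue
        url = parts[0]          # first token is the requested URL
        sid = store_id(url)
        sys.stdout.write('OK store-id=%s\n' % sid if sid else 'ERR\n')
        sys.stdout.flush()      # Squid expects one reply line per request

if __name__ == '__main__':
    main()

Hooked up in squid.conf with something like "store_id_program
/usr/local/bin/dedup_storeid.py" (path is an example). This only
deduplicates URLs matching rules you already know about, which is exactly
the "content Squid has accessed so far" limitation discussed above.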