Spoke with the boss. He said 'give us a week to make sure things are
stable, first'. I can understand that. Things got very hairy there for a
while. So...allow one week for better access. In the meantime, I'll
arrange for the collector to get run. Tomorrow, though. Tonight I have
to sleep.
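
Since the collector below is meant to run once an hour during collection, a crontab entry along these lines could schedule it (the script path and name here are assumptions, not from the original mail):

```shell
# Run the digest collector at the top of every hour during the
# collection window. The path /usr/local/squid/bin/collect-digests.sh
# is hypothetical; point it at wherever the script is installed.
0 * * * * /usr/local/squid/bin/collect-digests.sh
```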
D
Henrik Nordstrom wrote:
>
> Dancer wrote:
>
> > I can arrange a username/password... or get a specific address wired
> > into the configs for digest fetching in a pinch (gotta clear that with
> > the boss)
>
> Any method is fine, or even FTP access for that matter (actually
> preferred since I am not on-line). What is needed is two or three
> successive digests from the same cache.
>
> I would be very pleased if someone could provide FTP access (or HTTP) to
> some hours worth of digests, preferably paired with digest statistics.
>
> -- collector script, to be run once/hour during collection --
>
> #!/bin/sh
> proxy=proxy.some.domain
> port=3128
> date=`date +%Y%m%d%H%M`
> client=/usr/local/squid/bin/client
> outdir=/home/ftp/priv/digests/
>
> $client -h $proxy -p $port mgr:digest_stats >$outdir/$proxy-$date.stats
> $client -h $proxy -p $port /squid-internal-periodic/store_digest >$outdir/$proxy-$date.digest
>
> -----
>
> /Henrik
Received on Tue Jul 29 2003 - 13:15:59 MDT
This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:12:16 MST