From: Amos Jeffries <squid3_at_treenet.co.nz>
> Nick Cairncross wrote:
> > This is just a curiosity (whilst I have some time on my hands) and not
> > something I want to put into a live environment.
> > I once stumbled across a site which offered a program/plug-in to scan the
> > access.log file and watch for .jpg .gif etc images. These image links were
> > then pulled from the log and populated onto a constantly refreshing webpage
> > to provide a sort of 'mosaic' of images being viewed live. It
> > sounded...interesting, but I've never been able to find it again. I wondered
> > if anyone has seen such a thing or developed their own.
>
> Haven't heard of that one myself. But it's trivial these days to write a
> daemon script that accepts the log lines directly as squid generates them
> and does things like that in real time.
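A minimal sketch of that real-time idea, for the curious. It simply follows
the log with tail -F rather than using squid's logfile daemon interface, and
the paths and output file below are made up for illustration:

#!/bin/bash
# Follow the access log as squid writes it and collect image URLs live.
LOGFILE=/var/log/squid/access.log      # adjust to your setup
OUTFILE=/tmp/live_image_urls.txt       # hypothetical output list

tail -F "$LOGFILE" | while read -r line
do
    # Field 7 of the default native log format is the URL.
    url=$(echo "$line" | awk '{ print $7 }')
    case "$url" in
        *.jpg|*.gif) echo "$url" >> "$OUTFILE" ;;
    esac
done

The polling script below takes the simpler route of re-reading the whole log
every few seconds instead.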
A quick and dirty untested shell script:
#!/bin/bash
# Where the generated wall page and the squid access log live.
WALLFILE=images_wall.html
LOGFILE=/var/log/squid/access.log
# How many recent images to show, and the page refresh interval in seconds.
NB_IMG=50
SLEEP=10

while :
do
    echo "<HTML>"                                          >  "$WALLFILE.tmp"
    echo " <HEAD>"                                         >> "$WALLFILE.tmp"
    echo "  <TITLE>Images Wall</TITLE>"                    >> "$WALLFILE.tmp"
    echo "  <META HTTP-EQUIV='REFRESH' CONTENT='$SLEEP'>"  >> "$WALLFILE.tmp"
    echo " </HEAD>"                                        >> "$WALLFILE.tmp"
    echo " <BODY>"                                         >> "$WALLFILE.tmp"

    # Field 7 of the default squid log format is the URL; keep only the
    # last $NB_IMG .jpg/.gif requests.
    awk '{ print $7 }' "$LOGFILE" |
        grep -E '\.(jpg|gif)$' |
        tail -n "$NB_IMG" |
        while read -r URL
        do
            echo "  <IMG SRC='$URL'>" >> "$WALLFILE.tmp"
        done

    echo " </BODY>"                                        >> "$WALLFILE.tmp"
    echo "</HTML>"                                         >> "$WALLFILE.tmp"

    # Replace the page in one move so the browser never sees a half-written file.
    mv -f "$WALLFILE.tmp" "$WALLFILE"
    sleep "$SLEEP"
done
This needs to be adapted to your access log format: the $7 in awk selects the
7th field of each log line, which is the URL in squid's default native format.
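For reference, a default native format line looks roughly like this (values
invented for illustration), with the URL as the 7th whitespace-separated field:

1280330000.123    245 192.168.1.10 TCP_MISS/200 15342 GET http://example.com/pics/cat.jpg - DIRECT/203.0.113.5 image/jpeg

so awk '{ print $7 }' prints http://example.com/pics/cat.jpg for that line.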
You can also put it in cron and remove the main while loop and the sleep.
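For example (the script path here is just a placeholder), a crontab entry like
this would rebuild the page once a minute, which is as often as cron allows:

* * * * * /usr/local/bin/images_wall.sh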
JD
Received on Wed Jul 28 2010 - 15:40:13 MDT