> The proper way would be to set up Squid on its default port of 3128 or a
> common proxy port like 8080. Then set the proxy variable for wget (something
> like http_proxy=http://squidmachine:3128) so wget will request URLs through
> Squid.
>
> If you're trying to pre-fetch data, you might also be interested in the
> following wget options:
> --delete-after will delete the files after download
> -r does a recursive crawl
> -l for recursion depth
> -nd no directory creation, to speed up the fetching
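In shell terms, that advice boils down to something like the following (squidmachine is a placeholder for the proxy host, and the depth of 1 is just an example):

  http_proxy=http://squidmachine:3128 wget -r -l 1 -nd --delete-after http://www.example.com/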
Thanks,
I completely overlooked the .wgetrc file that let me configure
http_proxy, proxy-user, and proxy-password.
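For anyone else who misses it, the relevant .wgetrc entries look something like this (the host, user, and password values are placeholders, and the proxy-user/proxy-password lines are only needed if Squid requires authentication):

  http_proxy = http://squidmachine:3128/
  proxy-user = myname
  proxy-password = mypassword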
I ran:
wget -nd --delete-after http://www.indo.com/culture/dance_music.html
and the page was loaded into the squid cache, i.e.,
<snip from squid cache using purge>
/usr/local/orsquid/var/cache/00/00/00000012 0 21144 http://www.indo.com/culture/dance_music.html
<snip>
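Another quick check, without digging into the cache directory, is to request the page through squidclient (which ships with Squid) and look at the X-Cache header, e.g.:

  squidclient -h localhost -p 3128 http://www.indo.com/culture/dance_music.html | grep X-Cache

which should report something like "X-Cache: HIT from ..." if the object is served from cache.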
wget returned:
<snip>
100%[===================================================================>] 20,770  36.49K/s
11:35:08 (36.45 KB/s) - `dance_music.html' saved [20770/20770]
<snip>
and then I ran the same command, wget -nd --delete-after
http://www.indo.com/culture/dance_music.html, to see whether wget would
pull from the cache, and got:
<snip>
100%[===================================================================>] 20,770  --.--K/s
11:36:19 (89.22 MB/s) - `dance_music.html' saved [20770/20770]
<snip>
So I assume that the --.--K/s and the much higher rate (89.22 MB/s) in
the second run indicate that wget pulled from the Squid cache, right?
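One more way to confirm it, rather than guessing from the transfer rate, is Squid's access.log: the first fetch should be logged as a TCP_MISS and the second as a TCP_HIT (or TCP_MEM_HIT). Assuming the logs sit alongside the cache directory shown above, something like:

  grep dance_music.html /usr/local/orsquid/var/logs/access.log

should show both entries.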
Thanks,
Murrah Boswell