I have been told that the members of the Squid mailing list have made
various hacks to Geturl, a program I wrote to make it easy to
download data over different web protocols.
Does anyone have an archive of these patches? I would like to
incorporate them into the next release. BTW, I plan to release Geturl
1.4 in a few weeks, which will contain many new and useful features.
If you would like a feature you could use with Squid, or one that
Squid users would find useful, I will be more than happy to help. Mail
me at hniksic@srce.hr, or write to this list, since I am subscribed.
Geturl 1.4 will be distributed under the terms of the GNU General
Public License, like Squid itself.
Here is the relevant part of the NEWS file, covering the changes
between 1.3 and 1.4:
========================
Geturl 1.4 is an extensive rewrite of Geturl. Although many things
look suspiciously similar, most of the code was rewritten: recursive
retrieval, HTTP, FTP and almost everything else. Geturl should now be
easier to debug, maintain and, most importantly, use.
News in Geturl 1.4:
* Recursive HTTP should now work without glitches, even with Location
changes, server-generated indexes and other naughty stuff.
* HTTP regetting is supported on servers that support the Range
specification. WWW authorization is supported -- try
--http-user=username and --http-passwd=password (there is an example
after this list).
* FTP support was rewritten and greatly enhanced. Globbing should now
work flawlessly. Symbolic links are created locally. All the
information a Unix-style ls listing can give is now used.
* Recursive FTP is supported, e.g.
geturl -r ftp://gnjilux.cc.fer.hr/pub/unix/util/
* Time-stamping is supported, with both HTTP and FTP. Try geturl -N URL.
* A new Texinfo reference manual is provided. It can be converted to
Info, DVI, HTML or PostScript, as needed.
* Fixed a long-standing bug, so that Geturl works over SLIP connections.
* You can have a system-wide geturlrc (/usr/local/lib/geturlrc by
default). Settings in $HOME/.geturlrc override the global ones, of
course :-)
* You can set up a quota in .geturlrc to prevent sucking down too much
data. Try `quota = 5M' in .geturlrc (or quota = 100K if you want
your sysadmin to like you). A sample .geturlrc follows this list.
* Download rate in kilobytes per second is printed.
* Geturl now sends the Referer: header when retrieving
recursively.
* The HTML parser, as well as the whole of Geturl, was rewritten to be
much faster and to consume less memory.
* Absolute links can be converted to relative links locally. Check
geturl -k.
* Geturl catches the hangup signal, redirecting its output to a log
file and resuming work. Try kill -HUP %?geturl.
* User-defined headers can be sent. Try
geturl http://fly.cc.fer.hr/ --header='Accept-Charset: iso-8859-2'
* Accept/reject lists may contain wildcards.
* The SOCKS library is now supported (thanks to Antonio Rosella
<Antonio.Rosella@agip.it>). Configure with --with-socks.
* There is a nicer display of REST-ed (restarted) FTP output.
* Many new options (like -x to force directory hierarchy, or -m to
turn on mirroring options); examples follow this list.
* Geturl is now distributed under GNU General Public License (GPL).
* Lots of small features I can't remember. :-)
* A host of bugfixes.
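A few examples, to make the above more concrete. For the WWW
authorization feature, an invocation might look like this (the host
name, user and password here are made up):

geturl --http-user=luka --http-passwd=sekrit http://somehost.hr/private/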
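A sample $HOME/.geturlrc with the quota setting mentioned above would
contain just this line (pick whatever limit suits you):

quota = 5M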
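And a sketch of the new directory and mirroring options, reusing the
URLs from earlier in this announcement (for what -m turns on exactly,
check the manual):

geturl -x http://fly.cc.fer.hr/
geturl -m ftp://gnjilux.cc.fer.hr/pub/unix/util/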
--
} WWW: World-Wide-Waste. Waste management corporation, which
} handles the billions of tons of garbage generated by just
} about everybody these days.
} You owe the Oracle a good book. In HyperText, please.