Grabbing An Entire Website
xandy at tux.org
Wed Apr 19 16:48:44 EDT 2000
You probably want to look into wget. It can follow links to recursively
retrieve all documents referenced by an http URL (also does ftp, but you
specifically said your needs were http). There are a lot of options
(maximum depth, spanning hosts, converting absolute links to relative
ones locally, and so on), so I suggest reading the man page and then
asking more specific questions if you have them.
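
For example, a minimal invocation might look like the following (the
URL is just a placeholder, and these options are one reasonable
starting set, not the only one):

    wget -r -l 5 -k -np http://www.example.com/docs/

Here -r turns on recursive retrieval, -l 5 caps the recursion at five
levels deep, -k converts absolute links to relative ones so the local
copy browses cleanly offline, and -np keeps wget from ascending into
parent directories on the server.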
On Wed, 19 Apr 2000, Janina Sajka wrote:
> Anyone know how to auto-retrieve an entire www page hierarchy?
> I know software like ncftp and wuftp can tar up an entire directory
> tree, but the pages I need aren't available over ftp, only http. I'd hate
> to have to grab them by hand one at a time, though.