For command-line downloading of entire websites, there's "wget",
which can recreate the entire website in a local folder (links and
all). This is just what I've read in the past and what the manual
says:
www.gnu.org/.../wget.html#Overview
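From what the manual describes, the basic mirroring invocation looks
roughly like this (example.com is just a placeholder, not a site I'm
suggesting you mirror):

    wget --mirror --convert-links --page-requisites --no-parent https://example.com/

--mirror turns on recursion and timestamping, --convert-links rewrites
the links so the local copy works offline, and --page-requisites grabs
the images/CSS each page needs.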
I've personally never used it for anything more than downloading a few
select pages I wanted to read while on a plane. It seems like it could
put quite a strain on the host's servers, and I saw a website with a
tutorial on downloading entire sites with wget specifically asking that
it not be tested on their site for that reason!
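If you do try it on someone else's site, the --wait and --limit-rate
options are worth adding so you don't hammer the server; something like
this (the numbers are just an illustration, not a recommendation):

    wget --mirror --convert-links --page-requisites --no-parent \
         --wait=2 --random-wait --limit-rate=100k https://example.com/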