Unix / Linux / Ubuntu Forum


command line or software to download whole site

  Date: Jan 21    Category: Unix / Linux / Ubuntu    Views: 537
  

Is there any command line or software that will let me download a whole
website to my local computer?


 

19 Answers Found

 
Answer #1    Answered On: Jan 21    

Have a look at GWGet and WebHTTrack. Both are in the repositories.
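
If you want to install them from a terminal rather than the Software
Centre, something like the following should do it (a sketch only, and it
assumes the package names in your release's repositories are still
gwget and webhttrack):

   sudo apt-get update
   sudo apt-get install gwget webhttrack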

 
Answer #2    Answered On: Jan 21    

WebHTTrack works like a dream.

 
Answer #3    Answered On: Jan 21    

I use filezilla for FTP downloading and uploading.

http://filezilla-project.org/download.php

 
Answer #4    Answered On: Jan 21    

So do I, but as Ian will know, that's a particular type of downloading,
which perhaps isn't what Matt is after. He appears to want to download a
website for offline browsing.

To clarify for anyone who isn't aware of the difference: FTP is used by
website creators and owners. They use it to download, and more
particularly upload, their website between their local computer and a
server on the web.

The files you would access with Filezilla or the like are those that go
to build a website: some of them won't look very much like what you see
when you browse the site. For instance, any images will be in separate
files of their own. There may be a database, and pages of program code
such as ASP or PHP that generate web pages when a browser asks for them;
CSS files to specify how a page is laid out; javascript files to add
dynamism to a page; and so on. And you'll almost always need a private
password to access the site for those purposes.

If that is what you want Matt, then I agree Filezilla is an excellent
choice.

 
Answer #5    Answered On: Jan 21    

I build web sites, Steve, and use Filezilla as well as FrontPage, and I
agree you can only get the page structure, source code and graphics by
downloading pages from the web. You need to have built the site.

 
Answer #6    Answered On: Jan 21    

httrack is a good one and can be found in Synaptic.
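
If you would rather drive it from the terminal than through the
WebHTTrack front-end, here is a minimal sketch (example.com is a
placeholder and ./mirror is just an assumed output directory):

   httrack "http://example.com/" -O ./mirror "+*.example.com/*" -v

The "+*.example.com/*" filter keeps the crawl on that one domain, and
-v gives verbose progress output.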

 
Answer #7    Answered On: Jan 21    

For command-line downloading of entire websites, there is wget, which
can recreate the entire website in a local folder (links and all). This
is just what I have read in the past and what the manual says:

www.gnu.org/.../wget.html#Overview

I've personally never used it for anything more than downloading a few
select pages I wanted to read while on a plane. It seems like it could
put quite a strain on the host's servers, and I saw a website with a
tutorial on downloading entire sites with wget specifically asking that
it not be tested on their site for that reason!
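
For reference, a typical invocation for offline browsing looks
something like this (a sketch only; example.com is a placeholder and
the switches you actually need will depend on the site):

   wget --recursive --no-parent --convert-links --page-requisites http://example.com/

--recursive follows links on the site, --no-parent stops it climbing
above the starting directory, --convert-links rewrites the links so the
local copy browses properly off-line, and --page-requisites fetches the
images and CSS each page needs.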

 
Answer #8    Answered On: Jan 21    

A number of websites, particularly those based on wikis, are now
implementing 'burst triggers', which prevent you downloading too many
pages in a given time period, presumably for this reason (amongst others).

This means that such sites cannot be downloaded by conventional
mirroring programs, such as httrack. However, wget has two switches
which can help to lighten the load on the target servers, making a
successful download more likely. The first is --wait=[seconds], which,
as the name implies, waits for the specified number of seconds between
requests to the server.

However, some servers are now using log analysis to identify this
approach and block the download, so wget also has the --random-wait
switch, which gives a random delay between 0.5 and 1.5 times the value
specified in --wait (the two need to be used together).

So far as I am aware, wget is the only program to implement this feature
and so it's the one I would recommend.
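
Putting that together with the recursive options mentioned earlier
gives something like this (a sketch; the 2-second base delay and the
rate limit are arbitrary values to tune for the target server, and
--limit-rate is an extra courtesy not mentioned above):

   wget --recursive --no-parent --convert-links --page-requisites \
        --wait=2 --random-wait --limit-rate=100k http://example.com/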

 
Answer #9    Answered On: Jan 21    

I'm puzzled by this whole thread. Why would anyone want to download a whole web
site?

 
Answer #10    Answered On: Jan 21    

There are two main reasons.

1) To 'mirror' the site on another server
2) To allow offline browsing of the site if the user requires it in an
area with poor/no internet access.

 
Answer #11    Answered On: Jan 21    

And I will add to that: to "archive" it, as sometimes web sites go away
forever! I have a couple that had some interesting technical content
that I wanted to preserve and am glad I did. When I went to see if
there were any updates about a year later, the site was gone.

 
Answer #12    Answered On: Jan 21    


Why would someone want to download an entire site? Well, one thing is
it could be useful for some online tutorials/manuals, where each new
concept/section is a separate page. There were several instructional
documents like that at my previous job, where I would have definitely
downloaded the entire thing had I been aware of wget and the
possibility. As it was, I simply spent half a day printing out the
things in their entirety since I had no internet at home and didn't
fancy sitting in my basement office all night trying to catch up with
what I was already supposed to know. Same goes for a number of
JavaScript and HTML tutorials I was using to teach myself back in
the late '90s and early '00s... I had a delightful little dialup
connection back then, and would have much rather fetched everything to
my local disk while I slept easy at night.

 
Answer #13    Answered On: Jan 21    

Indeed, I download web tutorials for off-line viewing, so that I can
combine them with on-line working.

 
Answer #14    Answered On: Jan 21    

Interesting. It never would have occurred to me to do such a thing. I'd think
you might want to be careful of copyright issues, though.

 
Answer #15    Answered On: Jan 21    


He'd better not download my web site pages to use, because I build them for
clubs and they would be after him for sure.

 
Answer #16    Answered On: Jan 21    

I'm going after one that I have up where Tech support can't help me
reset my password. I lost my backups in a fire...

 
Answer #17    Answered On: Jan 21    

What host site do you use? Surely they have a password reset or at
least a support link to offer advice on lost passwords. I would never
use a host that did not offer such help.

 
Answer #18    Answered On: Jan 21    

Because you may want to edit it all on your computer.


 
Answer #19    Answered On: Jan 21    

Best answer I can think of is off-line viewing or archiving.
This (mostly?) assumes the downloader is not the author.
The author should already have local back-ups of all the files.

A second choice would be editing (if relative links were used in creation).

 