Hi there again folks,
I wonder if any of you have a favourite web archiver that lets you download whole websites and save them. I've used many of them, and currently I'm using the WinHTTrack Website Copier, which is not bad, but it could be better in some cases.
So, can you recommend any, please?
SFTP + 7zip?
Why do you want to download a whole website?
Why wouldn't you want to download a whole website?
Right click, "view page source".
Then, Ctrl+A, Ctrl+C.
Then open up your favorite HTML editor, Ctrl+V, File->Save.
You may have to do this with every single page on the site, and you may not end up getting everything, such as any external VBScript/JavaScript files, etc. But it's a start.
Google is your friend (http://soft.get.ge/softuncat-29.html).
Quote from: dasjoe on Sat 05/04/2008 02:16:05
Why wouldn't you want to download a whole website?
Dunno... Coz you can bring it up in your browser?
First, I sometimes want to download sites because they are disappearing (for example, walkthroughs for the first Thief game and that kind of stuff). Or I want to read them somewhere I don't have internet access. Or whatever.
Of course it's easy to save just a single page, but this kind of software lets you point it at a site and have it follow the hyperlinks, downloading everything on that server, or as much of it as you want.
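To give an idea, HTTrack (the engine behind WinHTTrack) also comes as a command-line tool; a minimal invocation, assuming the binary is installed as httrack and using example.com and ./mirror as placeholders for the real site and output folder, would look something like:

  httrack "https://www.example.com/" -O ./mirror "+www.example.com/*" -r6   # placeholder URL and output dir

That mirrors the site into ./mirror, only follows links matching the +filter so it stays on that server, and the -r6 caps the recursion depth at six levels so it doesn't wander off onto the whole internet.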
I'm pretty surprised that people don't use it.
AGA - yeah, I can do that too; I just wanted to know if anyone has a good one they can recommend. You know, someone "involved".
Looking back, it sounds kind of insane to download whole websites, but I've learned it's a good habit for rare ones or any you want to archive (I once downloaded an article on the JFK assassination that ran to about 100 HTML pages, with no problem, and now I can read it anywhere; I do this with music lyrics sometimes too. There are cases where I need it...).
For a single page I've used File -> Save Page As in my browser, and for whole websites I've used wget.
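Something like this usually covers it (example.com standing in for the actual site; check the man page for the full details):

  wget --mirror --convert-links --page-requisites --no-parent https://example.com/   # placeholder URL

--mirror turns on recursive downloading with timestamping, --convert-links rewrites the links so the local copy browses properly offline, --page-requisites grabs the images/CSS/JS each page needs, and --no-parent keeps it from climbing above the starting directory.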