Have you ever fallen into a situation where you need to search for some important information and your internet is down? Or have you found a website or blog really informative, wanted to read the whole site, but couldn't pay a cyber cafe for that much time? Then you are in the right place; here is the solution for you.
There are a few very popular tools available for saving a whole website for offline reading.
- HTTrack: It's a really lightweight tool (approx. 4 MB). It scans and downloads each and every page of a website for offline reading (a command-line usage sketch follows this list).
- The Linux/Unix command "wget": It is excellent at caching the whole content of a website. The syntax for that command is:

      wget --mirror -p --html-extension --convert-links www.example.com

  where:
  - --mirror = turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings
  - -p = download all images and other files needed to display the HTML pages
  - --html-extension = save HTML documents with the .html extension
  - --convert-links = make links in the downloaded HTML point to local files

  This command is run in a terminal on Linux/Unix-based systems; to know more about it, please check out the Wget Manual. A slightly extended example follows this list.
- Internet Download Manager (IDM): Yes, it's a very popular download tool; almost everybody uses it today, but few know about this feature. It has a small option called the grabber, and you can't imagine its capability until you try it. That is what I'm going to show you: how to use it right now to cache any complete website for reading it offline. It is an effective alternative to HTTrack.
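Before moving on to IDM, here is what the HTTrack option can look like from the command line. This is only a minimal sketch: the URL, the output folder name example-mirror, and the filter pattern are placeholders of my own, not something HTTrack forces on you.

      # mirror the site into ./example-mirror, staying on the example.com domain
      httrack "https://www.example.com/" -O "./example-mirror" "+*.example.com/*" -v

Here -O sets where the local copy is saved, the "+*.example.com/*" filter keeps the crawl on the same domain, and -v prints progress while it runs.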
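Similarly, here is a slightly extended version of the wget command from above. The two extra flags are optional additions of mine, useful when a site is large or sensitive to fast crawlers; the basic command already works without them.

      # mirror the site politely: wait 1 second between requests and
      # never climb above the starting directory
      wget --mirror -p --html-extension --convert-links \
           --wait=1 --no-parent \
           https://www.example.com/

The downloaded copy lands in a folder named after the host (here www.example.com), which you can then open in any browser while offline.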