Fli
02-21-2015, 09:49 PM
Hi, is there any Linux/Windows software (preferably open-source) which allows creating an offline browsable copy of a website?
I mean that I can select a domain and it will save its subpages for offline browsing...
So far I have experience with:
1) HTTrack Website Copier (http://www.httrack.com/) - it really works, but I don't like it because I always have to go through its rather involved configuration (see the command-line sketch after this list)
2) Using Google Search: next to a result's URL, clicking the arrow and selecting something like "Archive" or "Cached" may return Google's cached version of that webpage
3) Installing a Google Chrome extension like Read It Later (https://chrome.google.com/webstore/detail/read-it-later/aaocbkeamabaniccpnbapflopmcnpjbg); it adds an icon with a list of saved webpages
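For reference, HTTrack can also be driven from the command line, which skips most of the wizard configuration I find tedious. A minimal sketch, where the URL, output directory, and filter pattern are placeholders to adapt:

# mirror example.com into ./mirror, following only links on that domain
httrack "http://example.com/" -O "./mirror" "+*.example.com/*" -v

Once it finishes, browsing starts from the index.html that HTTrack writes into the output directory.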