Hi, is there any Linux/Windows software (preferably open source) which allows creating an offline browsable copy of a website?
I mean that I can select a domain and it will save its subpages for offline browsing...
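
For reference, here is roughly what I am after, as a minimal sketch using only the Python standard library (the start URL, output folder, and page limit are placeholders, not any particular tool's defaults): it fetches pages from a single domain and saves them as files that can be opened offline.

    # Minimal sketch of same-domain crawling with the Python standard library.
    # START_URL and OUT_DIR are placeholders.
    import os
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    START_URL = "https://example.com/"   # hypothetical site to copy
    OUT_DIR = "mirror"                   # local folder for the saved pages

    class LinkCollector(HTMLParser):
        """Collect href targets from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def local_path(url):
        """Map a URL to a file path under OUT_DIR."""
        path = urllib.parse.urlparse(url).path.lstrip("/") or "index.html"
        if path.endswith("/"):
            path += "index.html"
        if not os.path.splitext(path)[1]:
            path += ".html"
        return os.path.join(OUT_DIR, path)

    def crawl(start_url, limit=50):
        domain = urllib.parse.urlparse(start_url).netloc
        queue, seen = [start_url], set()
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url) as resp:
                    html = resp.read()
            except OSError:
                continue
            dest = local_path(url)
            os.makedirs(os.path.dirname(dest), exist_ok=True)
            with open(dest, "wb") as f:
                f.write(html)
            # Queue same-domain links found on this page.
            parser = LinkCollector()
            parser.feed(html.decode("utf-8", errors="replace"))
            for link in parser.links:
                absolute = urllib.parse.urljoin(url, link)
                if urllib.parse.urlparse(absolute).netloc == domain:
                    queue.append(absolute.split("#")[0])

    if __name__ == "__main__":
        crawl(START_URL)

This does not rewrite links or fetch images/CSS the way a real mirroring tool does, which is exactly why I am looking for ready-made software.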

So far I have experience with:

1) HTTrack Website Copier: it really works, but I don't like it because I always have to go through its awkward configuration (see the sketch after this list for the kind of one-shot setup I would prefer).

2) Using Google Search: next to a result's URL, click the arrow and select something like Archive or Cached, and Google may return a cached version of that webpage.

3) Installing a Google Chrome extension like Read It Later: it adds an icon with a list of saved webpages.
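
Regarding point 1, what I would like to avoid is HTTrack's interactive wizard. Something like the following one-shot call is closer to what I want (the URL, output folder, and domain filter are placeholders; check `httrack --help` on your install for the exact options):

    # Hedged sketch: driving HTTrack non-interactively from Python,
    # so the whole configuration step boils down to a single call.
    # Assumes httrack is installed and on PATH.
    import subprocess

    subprocess.run(
        [
            "httrack",
            "https://example.com/",        # site to mirror (placeholder)
            "-O", "/tmp/example-mirror",   # where to store the offline copy
            "+*.example.com/*",            # filter: stay inside this domain
        ],
        check=True,
    )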