Crawl a website, save all files locally. Very simple interface. Runs locally, not a cloud service. Own your own data. Options to preserve all files or to process them to allow browsing of the local copy.
Operating System: Mac OS X
- Crawl a website and save all files locally.
- Very simple interface: enter your page URL and press Start. When the crawl finishes, you'll see a save dialog.
- Powerful crawl settings, allowing rate limiting, black/white listing, setting of the user-agent string (spoofing), and more.
- Runs locally, not a cloud service. Own your own data.
- Option to preserve all files exactly as they were fetched, under their original filenames, or to process them to allow browsing of the local copy.
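The "process" option described above, rewriting fetched pages so they can be browsed offline, can be sketched in a few lines of Python. This is a hypothetical illustration only, not the app's actual engine; the `localize_links` name and the regex approach are assumptions:

```python
import re

def localize_links(html: str, domain: str) -> str:
    """Rewrite absolute links on the given domain to local relative paths,
    so a saved copy can be browsed offline. Treats http:// and https://
    links on the same domain alike."""
    pattern = re.compile(
        r'(href|src)="https?://' + re.escape(domain) + r'(/[^"]*)"')
    return pattern.sub(lambda m: f'{m.group(1)}=".{m.group(2)}"', html)
```

A real crawler would also have to handle relative links, query strings, and image URLs inside inline styles, but the core idea is the same: map every same-domain URL to a path inside the local copy.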
Version 0.4.0:
- Improves the engine so that certain sites display properly locally after being saved with the 'process' option.
- Updates the selectable user-agent strings and adds more.
- Changes the default setting for treating http:// links on the same domain (when starting from an https:// URL).
Version 0.3.0:
- Adds a 'single page' option.
- Adds an option to archive all files from a website.
- Improves the crawling engine; it now finds and processes image URLs within inline styles.