There are many websites with complicated pages that don't print well, and some pages can't even be saved from the browser. One solution to these problems is WebReaper. This software is a web crawler, or spider, that works its way through a website, downloading the pages, pictures and objects it finds so that they can be viewed locally, without needing to be connected to the internet.
Search engines turn up any number of websites on a given topic. I usually download three or four complete sites on a topic with the help of WebReaper, then explore them all together in my free time, even if I don't have an internet connection at that point. This way I manage to gather all the information on any topic I look for.
What WebReaper actually does is store each site locally as a fully browsable website that can be viewed with any browser, such as Internet Explorer, Netscape or Opera. The sites can also be saved into the Internet Explorer cache and viewed using IE's Offline Mode.
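To make the idea concrete, here is a minimal sketch of what a crawler like this does: parse a downloaded page, resolve its links, and keep only those that stay on the starting server (one of the filter ideas WebReaper offers). This is purely an illustration using Python's standard library, not WebReaper's actual code.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href/src targets from a page, resolved against the page's URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Pick up both hyperlinks (href) and embedded objects (src),
        # resolving relative paths against the page they came from.
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(self.base_url, value))

def same_host_links(html, base_url):
    """Return the links that stay on the starting server."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [u for u in parser.links if urlparse(u).netloc == host]
```

A real crawler would repeat this for every link found, fetching each page, saving it to disk with links rewritten to relative paths, and tracking visited URLs to avoid loops, which is essentially what WebReaper automates.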
Some of WebReaper's key features are listed below:
- Multithreaded downloading
- Explorer-style interface
- ShockWave Flash support - downloads/fixes up SWF movies for local browsing
- User-customisable filters - limit by depth, time per object, total time and 'distance' from starting server, and many others.
- Simple-to-use filter wizard - helps you build complex filters quickly and easily.
- Full Drag & Drop support - drag links to/from Internet Explorer/Netscape.
- Save downloaded files using relative paths to recreate websites stored locally with links adjusted to make them fully browsable.
- 'Resume' mode reads files saved locally to avoid reloading unchanged pages
- Proxy & website authentication, allowing websites with passwords or behind firewalls to be reaped.
- 'URL Profiles' allow depth and filter configurations to be saved with associated URLs for easy re-reaping in future.
- Command-Line execution to run as a batch process, using a task scheduler (not provided).
- Works with GetRight® for large file downloads.