Get all links from an HTML page
If DownloadPlus is given an HTML page, it could
parse the page for links and add all of the linked
URLs to the list of files to download.
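A minimal Python sketch of that extraction step, using only the
standard library (the class name and URLs here are just
illustrative, not part of DownloadPlus):

    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        """Collect the href of every <a> tag, resolved against the page URL."""
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        # Relative links become absolute download URLs.
                        self.links.append(urljoin(self.base_url, value))

    page = '<a href="files/app.zip">app</a> <a href="notes.html">notes</a>'
    collector = LinkCollector("http://example.com/downloads/")
    collector.feed(page)
    print(collector.links)
    # ['http://example.com/downloads/files/app.zip',
    #  'http://example.com/downloads/notes.html']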
Further options would also be nice: filtering by wildcard
match (*.zip, for example), a menu for choosing which links
to add and which to skip, and the ability to recurse when
another HTML page is found (with a configurable recursion
depth).
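The wildcard filter is cheap once the link list exists; one
way to do it, assuming Python's standard fnmatch module:

    from fnmatch import fnmatch

    links = [
        "http://example.com/files/app.zip",
        "http://example.com/files/readme.txt",
        "http://example.com/other.html",
    ]
    pattern = "*.zip"  # user-supplied wildcard
    to_download = [url for url in links if fnmatch(url, pattern)]
    print(to_download)  # ['http://example.com/files/app.zip']

Recursion could reuse the same extraction on each matched
HTML page, passing down a depth counter that decrements at
every level and stops the crawl when it reaches zero.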
Going recursively through a site and downloading every file
can be dangerous, and it's actually quite hard to do.
Still, I can implement a "download every file on the page"
feature, or "download everything in an FTP directory".
I will probably work on it over the Christmas holidays.
Thanks for the advice.