Need a link gatherer coded in Python. The program should load a list of URLs from a text file in this format:
[login to view URL]
[login to view URL]
[login to view URL]
Then it will gather the URLs based on the options passed to the program.
The options would be:
1) -t (thread count: how many simultaneous connections to run). Example: -t 500
2) -T (timeout in seconds before giving up on a single connection). Example: -T 30
3) -good (links must contain this string to be collected). Example: -good .mp3
4) -o (file to save the results to). Example: -o [login to view URL]
5) -l (list of URLs to scan). Example: -l [login to view URL]
So the overall execution would look like: ./[login to view URL] -t 100 -T 30 -good .mp3 -o [login to view URL] -l [login to view URL]
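A minimal sketch of how such a script could look, assuming argparse for the flags and a thread pool for the simultaneous connections. The file names on the command line are placeholders, and the regex-based link extraction is one simple choice among several:

```python
import argparse
import re
from concurrent.futures import ThreadPoolExecutor, as_completed
from urllib.request import urlopen

def extract_matching_links(html, needle):
    """Return every href target in html that contains the needle string."""
    links = re.findall(r'href=["\']([^"\']+)["\']', html)
    return [link for link in links if needle in link]

def fetch(url, timeout):
    """Download one page; return empty text instead of crashing on bad URLs."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except Exception:
        return ""

def main():
    parser = argparse.ArgumentParser(description="Gather links from a list of URLs")
    parser.add_argument("-t", type=int, default=100, help="simultaneous connections")
    parser.add_argument("-T", type=int, default=30, help="timeout per connection, seconds")
    parser.add_argument("-good", required=True, help="string a link must contain")
    parser.add_argument("-o", required=True, help="output file for collected links")
    parser.add_argument("-l", required=True, help="text file with one URL per line")
    args = parser.parse_args()

    with open(args.l, encoding="utf-8", errors="replace") as f:
        urls = [line.strip() for line in f if line.strip()]

    found = []
    with ThreadPoolExecutor(max_workers=args.t) as pool:
        futures = [pool.submit(fetch, url, args.T) for url in urls]
        for fut in as_completed(futures):
            found.extend(extract_matching_links(fut.result(), args.good))

    with open(args.o, "w", encoding="utf-8") as out:
        out.write("\n".join(found))

# Call main() when running as a script, e.g.:
# python gather.py -t 100 -T 30 -good .mp3 -o out.txt -l list.txt
```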
I can code that in Python.
My priority is quality: making the script fail-proof so it does not break on strange characters in the links.
However, if you are sure the links file will never contain problems such as malformed links or invalid characters, and you are willing to take that risk, I can deliver the script much faster.
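On the strange-characters point, one common safeguard is to percent-encode unsafe characters in each URL before requesting it. A small sketch using only the standard library (the function name is mine, not part of any spec):

```python
from urllib.parse import urlsplit, urlunsplit, quote

def sanitize_url(url):
    """Percent-encode unsafe characters in the path, query, and fragment
    so links containing spaces or odd symbols don't break the request."""
    parts = urlsplit(url.strip())
    return urlunsplit((
        parts.scheme,
        parts.netloc,
        quote(parts.path, safe="/%"),      # keep existing %XX escapes intact
        quote(parts.query, safe="=&%"),    # preserve key=value&key=value form
        quote(parts.fragment, safe="%"),
    ))
```

For example, `sanitize_url("http://example.com/a b.mp3")` yields `http://example.com/a%20b.mp3`, which urlopen accepts where the raw form would fail.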