Matteo wrote:
    from urllib.request import urlopen

    srcdata = urlopen(url).read()
    dstfile = open(path, mode='wb')
    dstfile.write(srcdata)
    dstfile.close()
    print("Done!")

Have you tried reading all the files first, then saving each one to the appropriate directory? It might work if you have enough memory, i.e. if the files you are downloading are small. I assume they are; otherwise it would be almost useless to optimize the code, since the most time-consuming part would always be the download itself. Either way, I would try it and time it, or timeit. ;) Note that opening a network connection does take some time, independent of the size of the files you are downloading and of the kind of code requesting it, so you can't do much about that. If you had Linux you could probably get better results with wget, but that's another story altogether.
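A minimal sketch of that two-pass idea (read everything into memory first, then write each file out); the function name and the (url, path) tuple shape are my assumptions, not from the original post:

```python
from urllib.request import urlopen

def fetch_then_save(pairs):
    """pairs is an iterable of (url, path) tuples."""
    # First pass: read every file into memory.
    # Only sensible if the files are small enough to hold at once.
    blobs = [(path, urlopen(url).read()) for url, path in pairs]
    # Second pass: write each blob to its destination path.
    for path, data in blobs:
        with open(path, 'wb') as f:
            f.write(data)
```

Note this only helps if writing to disk was interleaved badly with downloading; the downloads themselves still happen one after another.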
If your net connection is already working at its maximum, then there's nothing you can do to speed up the downloads. If it's the per-request response time that's the problem, then you could put the tuples into a queue and run a number of threads, each one repeatedly getting a tuple from the queue and downloading it, until the queue is empty.

-- 
http://mail.python.org/mailman/listinfo/python-list
