Hi,

I suspect this question will not have as elegant an answer as I'd like, but I was wondering if anyone on this mailing list knows an easy way (easy enough for someone with as little programming understanding as myself) to automatically download all the files in a category or an article on a Wikimedia project/Wikia site. I am running Windows 7 (64-bit) SP1 if it is relevant, although I do have access to a Linux command line (namely Cygwin 64-bit) if needed.

I have seen http://how-to.wikia.com/wiki/How_to_download_all_image_files_in_a_Wikimedia_Commons_page_or_directory but it didn't help me, as at the |WIKI_LINKS| step I received an error that there was no such page on Wikimedia Commons.

If you would like a specific example to work with (i.e., a specific category to write code to download from), try the Wikimedia Commons category Histopathology (https://commons.wikimedia.org/wiki/Category:Histopathology). Don't worry, I know this download will take a while due to the number and size of the files involved.
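The kind of script being asked for here could be sketched as follows, using the public MediaWiki API on Commons (action=query with a categorymembers generator plus imageinfo to get file URLs). This is a minimal, hedged sketch, not a polished tool: it assumes Python 3 with only the standard library, the output directory name "histopathology" is illustrative, and large categories are handled via API continuation.

```python
# Sketch: download every file in a Wikimedia Commons category via the
# MediaWiki API. Assumes Python 3 standard library only; the category
# and output directory below are illustrative.
import json
import os
import urllib.parse
import urllib.request

API = "https://commons.wikimedia.org/w/api.php"

def api_url(params):
    """Build a query URL for the Commons API from a dict of parameters."""
    base = {"action": "query", "format": "json"}
    base.update(params)
    return API + "?" + urllib.parse.urlencode(base)

def list_category_files(category, batch=50):
    """Yield (title, url) for each file in the category, following the
    API's 'continue' tokens so categories larger than one batch work."""
    params = {
        "generator": "categorymembers",
        "gcmtitle": category,
        "gcmtype": "file",
        "gcmlimit": str(batch),
        "prop": "imageinfo",
        "iiprop": "url",
    }
    while True:
        with urllib.request.urlopen(api_url(params)) as resp:
            data = json.load(resp)
        for page in data.get("query", {}).get("pages", {}).values():
            info = page.get("imageinfo", [{}])[0]
            if "url" in info:
                yield page["title"], info["url"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # carry the continuation tokens

if __name__ == "__main__":
    os.makedirs("histopathology", exist_ok=True)
    for title, url in list_category_files("Category:Histopathology"):
        # Commons titles look like "File:Name.jpg"; strip the namespace.
        name = title.split(":", 1)[1].replace("/", "_")
        urllib.request.urlretrieve(url, os.path.join("histopathology", name))
        print("saved", name)
```

Running it from a Cygwin or Windows command prompt with `python download.py` would fill the `histopathology` directory; swapping in another category name in the last block retargets it.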

Thanks for your time,
Brenton
_______________________________________________
MediaWiki-l mailing list
To unsubscribe, go to:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l