You can use the generator=allpages option, but there is a limit to how many pages you can export at once. Something like https://www.mediawiki.org/w/api.php?action=query&export&generator=allpages&gaplimit=max&exportnowrap=1
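A minimal sketch of batching that request, assuming a standard api.php endpoint. One caveat: with exportnowrap=1 the response is raw export XML and the API's continuation token is not included, so for a wiki larger than one batch you would drop exportnowrap and read continue.gapcontinue from the wrapped response instead. The function name export_url is illustrative, not part of the API.

```python
# Sketch only: build api.php export URLs for batches of pages via
# generator=allpages, resuming with gapcontinue between batches.
from urllib.parse import urlencode

def export_url(api_base, gapcontinue=None):
    """URL for one batch of exported pages (wrapped JSON response)."""
    params = {
        "action": "query",
        "generator": "allpages",
        "gaplimit": "max",   # the server caps the actual batch size
        "export": 1,         # include the <mediawiki> export XML
        "format": "json",    # wrapped, so continuation info survives
    }
    if gapcontinue is not None:
        # resume where the previous batch stopped
        params["gapcontinue"] = gapcontinue
    return api_base + "?" + urlencode(params)

# Usage (network access assumed):
#   import json
#   from urllib.request import urlopen
#   data = json.load(urlopen(export_url("https://www.mediawiki.org/w/api.php")))
#   xml = data["query"]["export"]["*"]
#   next_token = data.get("continue", {}).get("gapcontinue")
#   # loop, passing next_token back in, until "continue" is absent
```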
If it is at all an option, I would suggest the command-line dumpBackup.php script - https://www.mediawiki.org/wiki/Manual:DumpBackup.php

-- Brian

On Tuesday, July 4, 2023, JFCM <[email protected]> wrote:
> Hi!
> I hope this is the right list to ask. If not, apologies.
>
> I wish to use the api.php command to download all the text of a wiki of
> mine. Must I run it on a per-page basis, or is there a parameter for
> "every page"? Thank you a lot.
> jfc
> _______________________________________________
> MediaWiki-l mailing list -- [email protected]
> To unsubscribe send an email to [email protected]
> https://lists.wikimedia.org/postorius/lists/mediawiki-l.lists.wikimedia.org/
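For completeness, a sketch of wrapping that script, assuming shell access to the wiki server and that it is run from the wiki's root directory. The --current, --full, and --output=file: options are documented in Manual:DumpBackup.php; the helper name dump_command is mine.

```python
# Sketch: compose (and optionally run) MediaWiki's dumpBackup.php
# invocation. --current dumps only the latest revision of each page;
# --full dumps the complete history.
import subprocess

def dump_command(output_path, full_history=False):
    """Compose the dumpBackup.php command line (does not run it)."""
    flag = "--full" if full_history else "--current"
    return ["php", "maintenance/dumpBackup.php", flag,
            "--output=file:" + output_path]

# Usage (run on the wiki server, from the MediaWiki install directory):
#   subprocess.run(dump_command("dump.xml"), check=True)
```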
