I don't quite get why you would reindex all the data at once. Each document has a unique id, so you only need to reindex what has actually changed. If there is too much to process in one go, I'd suggest a batch processor that adds N tasks, each covering an id range (1:10, 10:20, and so on), with a cron job working through them; a rough sketch is below. Thousands of documents are fine, but once you hit millions you're in trouble. Cheers.
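For illustration only, here is a minimal sketch of that range-based batching, assuming a Python client talking to Solr over HTTP with the JSON APIs of a reasonably recent release. The core URL, the numeric id field, and the reindex_range helper are all hypothetical names, not anything shipped with Solr:

import requests

SOLR = "http://localhost:8983/solr/mycore"   # hypothetical core URL
ROWS = 1000                                  # must be >= the number of docs in one slice

def reindex_range(lo, hi):
    """Fetch one id range and re-add the documents so Solr reindexes them."""
    params = {"q": "id:[%s TO %s]" % (lo, hi), "rows": ROWS, "wt": "json"}
    docs = requests.get(SOLR + "/select", params=params).json()["response"]["docs"]
    for d in docs:
        d.pop("_version_", None)             # drop internal fields before re-adding, if present
    requests.post(SOLR + "/update", params={"commit": "true"}, json=docs)

# A cron job (or queue worker) would call this with the next slice each run:
# reindex_range(1, 10); reindex_range(10, 20); ...

Each run only touches one slice, so a full reindex never has to hold more than one range's worth of documents in memory.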
Ryan McKinley wrote:
> I don't know of any standard export/import tool -- i think Luke has
> something, but it will be faster if you write your own.
>
> Rather than id:[* TO *], just try *:* -- this should match all
> documents without using a range query.
>
> On Jan 25, 2009, at 3:16 PM, Ian Connor wrote:
>
>> Hi,
>>
>> Given that the only real way to reindex is to save the document again,
>> what is the fastest way to extract all the documents from a Solr index
>> so they can be resaved?
>>
>> I have tried the id:[* TO *] trick; however, it takes a while once you
>> get a few thousand documents into the index. Are there any tools that
>> will quickly export the index to a text file, or is querying 1000 at a
>> time the best option, accepting the time it takes to query once you are
>> deep into the index?
>>
>> --
>> Regards,
>>
>> Ian Connor
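For completeness, a hedged sketch of the paginated *:* export being discussed, again in Python with the requests library against Solr's JSON response format; the core URL, the export_all helper, and the output filename are made-up for the example:

import json
import requests

SOLR = "http://localhost:8983/solr/mycore"   # hypothetical core URL
ROWS = 1000                                  # page size per request

def export_all(path):
    """Page through the whole index with q=*:* and dump each doc as one JSON line."""
    start = 0
    with open(path, "w") as out:
        while True:
            params = {"q": "*:*", "start": start, "rows": ROWS, "wt": "json"}
            docs = requests.get(SOLR + "/select", params=params).json()["response"]["docs"]
            if not docs:
                break
            for d in docs:
                out.write(json.dumps(d) + "\n")
            start += ROWS

export_all("solr-dump.jsonl")

Note that paging with start/rows still gets slower the deeper you go, which is exactly the problem described above; newer Solr releases offer cursor-based paging (cursorMark) that avoids the deep-paging cost, if upgrading is an option.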