We are planning to replace an old-school Lucene index holding 50M docs with 
SolrCloud. According to the responsible colleague, the daily update volume 
could be around 100 thousand docs. The data source is a set of MySQL tables. 
When implementing the update workflow, what should I do to keep the time spent 
updating docs reasonable? Currently what I have in mind is:

1. Use atomic updates to avoid unnecessary full-document updates.
2. Run multiple copies of my updating process, each handling a different range of docs.
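To make the two ideas concrete, here is a minimal sketch of both: building a Solr atomic-update payload (only changed fields, each wrapped in a "set" modifier, as Solr's JSON update handler expects) and splitting an id range across several updater processes. The field names and id range below are hypothetical, not from my actual schema:

```python
import json

def atomic_update(doc_id, changed_fields):
    """Build one Solr atomic-update document: the unique key plus only
    the changed fields, each wrapped in a {"set": value} modifier."""
    doc = {"id": doc_id}
    for field, value in changed_fields.items():
        doc[field] = {"set": value}
    return doc

def partition(lo, hi, workers):
    """Split the inclusive id range [lo, hi] into roughly equal
    sub-ranges, one per updating process."""
    step = (hi - lo + workers) // workers
    return [(s, min(s + step - 1, hi)) for s in range(lo, hi + 1, step)]

# Hypothetical example: update two fields of one doc, and split
# 100k daily updates across 4 worker processes.
payload = [atomic_update("doc-42", {"price": 9.99, "popularity": 5})]
print(json.dumps(payload))
print(partition(1, 100000, 4))
```

Each worker would then POST its batch of payloads to the collection's /update handler; batching many docs per request and committing sparingly (e.g. relying on autoCommit) usually matters as much as parallelism.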

Are there other things I can do to help with this? Are there any suggestions 
or experiences on preparing appropriate hardware, e.g. CPU or RAM?

scott.chu,scott....@udngroup.com
2016/6/7 (Tue)
