Hi, I am posting my query here as it is very closely related to this one. I have an application that creates a different dataset on each invocation — different columns/schema/metadata and, of course, different data. The only field common across the generated datasets is the primary key. I have looked through multiple forums but could not determine the best approach with respect to a single core vs. multiple cores. Since my application creates a new set of fields and data every time, the schema is effectively different on each run, and the number of runs can go up to 100. With the multi-core approach, in the worst case I would be using around 20 cores at a time.
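For context, in the per-run-core approach each run would get its own core created via Solr's CoreAdmin API. A minimal sketch of building that request (the host, core-name pattern, and config set name are placeholders I made up, not something fixed in my setup):

```python
# Hypothetical sketch: build the CoreAdmin CREATE request URL for a
# dedicated per-run core. "run_<id>" naming and "shared_conf" config
# set are assumptions for illustration only.
from urllib.parse import urlencode

def core_create_url(run_id, host="localhost:8983", configset="shared_conf"):
    """Return the CoreAdmin URL that would create a core for one run."""
    params = {
        "action": "CREATE",
        "name": f"run_{run_id}",
        "instanceDir": f"run_{run_id}",
        "configSet": configset,
    }
    return f"http://{host}/solr/admin/cores?{urlencode(params)}"

print(core_create_url(42))
# The resulting URL would be issued as an HTTP GET against a running
# Solr instance; each run's documents would then be indexed into its
# own core, leaving the other runs' cores untouched.
```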
My questions are:

1) What factors should I consider to decide which option to choose?

Pros of multiple cores:
a) I understand that search performance will be faster, giving a better user experience.
b) Maintainability will be better, since a change in one core will not affect the others.
c) I can easily fetch the schema for each run if I have one core per run. With a single core holding all the data, that is going to be tricky.

Cons:
a) We may have to move cores in and out of memory under contention, which will result in higher CPU utilization.

2) What is the indexing time for a single core if it is updated again and again? Is the whole core affected when I add more rows? In the multiple-core model, I only need to index the one core I am updating.

3) How much more memory will the index take if I split the single core into multiple cores?

Please help me answer these questions.

Thanks,
Sumit

--
View this message in context: http://lucene.472066.n3.nabble.com/Single-Core-or-Multiple-Core-tp501748p4004254.html
Sent from the Solr - User mailing list archive at Nabble.com.