Hello Everyone,

If I have a large data set that needs to be indexed, what strategy can I
take to build the index quickly?

1. Split the input into multiple XML files, then open separate shells and
post each split file concurrently (see the first sketch after this list).
Will this work, and will it build the index faster than posting one large
XML file?

2. What if I don't want to build the XML files at all? I could write the
extraction logic in an ETL tool and have the tool send update requests to
Solr directly, then run the ETL job in a multi-threaded manner where each
thread extracts data from the backend and sends it to Solr for indexing
(see the second sketch below).

3. Use the multi-core feature: populate each core separately, then merge
the cores (see the third sketch below).
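
To make (1) concrete, below is a rough Java sketch of the kind of
concurrent poster I have in mind; the update URL and thread count are
placeholders for a default local Solr:

import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelPoster {
    // Placeholder endpoint for a default local Solr instance.
    static final String UPDATE_URL = "http://localhost:8983/solr/update";

    public static void main(String[] args) throws Exception {
        // One task per split XML file, at most four posting concurrently.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (String file : args) {
            pool.submit(() -> post(Paths.get(file)));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        // Single commit once every file has been posted.
        postBody("<commit/>".getBytes("UTF-8"));
    }

    static void post(Path file) {
        try {
            postBody(Files.readAllBytes(file));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    static void postBody(byte[] body) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(UPDATE_URL).openConnection();
        con.setDoOutput(true);
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        try (OutputStream out = con.getOutputStream()) {
            out.write(body);
        }
        if (con.getResponseCode() != 200) {
            System.err.println("Solr returned HTTP " + con.getResponseCode());
        }
        con.disconnect();
    }
}

Is a single commit at the end the right approach here, or should each
poster commit on its own?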
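
For (2), the ETL side would look something like the sketch below, assuming
a reasonably recent SolrJ client; Record and fetchPartition are stand-ins
for whatever the ETL tool actually extracts:

import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class EtlIndexer {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; a SolrClient is thread-safe, so one instance is shared.
        SolrClient solr = new HttpSolrClient.Builder("http://localhost:8983/solr/collection1").build();
        int threads = 8;
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            final int partition = t;
            pool.submit(() -> {
                // Each thread extracts one partition from the backend and indexes it.
                for (Record r : fetchPartition(partition)) {
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", r.id);
                    doc.addField("text", r.text);
                    try {
                        solr.add(doc);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        solr.commit(); // one commit at the end rather than per document
        solr.close();
    }

    // Stand-ins for the ETL tool's record type and extraction logic.
    static class Record { String id; String text; }
    static List<Record> fetchPartition(int partition) { return Collections.emptyList(); }
}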
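
For (3), if I understand the CoreAdmin API correctly, the final merge
would come down to a single mergeindexes call along these lines (the core
name and index paths are made up):

import java.net.HttpURLConnection;
import java.net.URL;

public class MergeCores {
    public static void main(String[] args) throws Exception {
        // CoreAdmin mergeindexes: merge the two source indexes into core "main".
        String url = "http://localhost:8983/solr/admin/cores?action=mergeindexes"
                + "&core=main"
                + "&indexDir=/var/solr/core1/data/index"
                + "&indexDir=/var/solr/core2/data/index";
        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        System.out.println("Merge returned HTTP " + con.getResponseCode());
        con.disconnect();
    }
}

If I read the docs right, the source indexes must not receive writes while
the merge runs; is that correct?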

Any other approach?


