Hi,

I'm doing a stress test on Solr.
I have around 8.5M documents, and the size of my data directory is 5.6 GB.

I have re-indexed my data to make it faster and applied all the latest
patches.
My index stores just two fields: id and text (which is a copy of three
fields).
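To be concrete, the relevant part of my schema.xml looks roughly like this
(field_a/field_b/field_c are placeholders for my real three source fields):

  <field name="id" type="string" indexed="true" stored="true"/>
  <field name="text" type="text" indexed="true" stored="true"/>

  <!-- "text" is filled by copying the three source fields -->
  <copyField source="field_a" dest="text"/>
  <copyField source="field_b" dest="text"/>
  <copyField source="field_c" dest="text"/>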
But the response times still seem very long to me; what do you think?

At 50 requests/sec for 40 minutes (49,430 requests in total), my average
response time was 1235 ms.

When I run the test at 100 requests/sec for 10 minutes, followed by another
10 minutes at 50 requests/sec, my average response time is 1600 ms. Don't you
think that's a bit long?

Should I partition this index further, or what else should I do to make it
faster?
I have read posts from people getting just 300 ms responses on 300 GB of
partitioned index.
The query I use to collect all these books is quite complex, with many tables
joined together; maybe it would be faster if I generated a CSV file instead?
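To make that concrete, here is a rough SolrJ sketch of what I mean by
flattening: one denormalized SELECT instead of joining many tables for each
document (the connection URLs and the books_flat view are placeholders, not
my real setup):

  import java.sql.*;
  import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
  import org.apache.solr.common.SolrInputDocument;

  public class FlatIndexer {
      public static void main(String[] args) throws Exception {
          // placeholder URLs/credentials -- adjust to the real setup
          CommonsHttpSolrServer solr =
              new CommonsHttpSolrServer("http://localhost:8080/solr");

          Class.forName("com.mysql.jdbc.Driver");
          Connection db = DriverManager.getConnection(
              "jdbc:mysql://localhost/books", "user", "password");

          // stream the result set so 8.5M rows don't sit in memory at once
          Statement st = db.createStatement(
              ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
          st.setFetchSize(Integer.MIN_VALUE);

          // "books_flat" is a hypothetical view that pre-joins the tables
          ResultSet rs = st.executeQuery("SELECT id, text FROM books_flat");
          while (rs.next()) {
              SolrInputDocument doc = new SolrInputDocument();
              doc.addField("id", rs.getString("id"));
              doc.addField("text", rs.getString("text"));
              solr.add(doc);
          }
          solr.commit();  // one commit at the end, not per document

          rs.close();
          st.close();
          db.close();
      }
  }

Would loading from a single flat source like this (or posting the equivalent
CSV to the CSV update handler) normally speed up indexing?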

The server I'm using for the test has 8 GB of memory.
4 CPUs: Intel(R) Xeon(R) CPU 5160 @ 3.00GHz
Tomcat 5.5: -Xms2000m -Xmx4000m
Solr 1.3.

What can I change to make it more performant? Memory, indexing...?
Could the problem come from my queries to the MySQL database, which involve
too many joins?
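For example, would tuning the caches and searcher warming in solrconfig.xml
help? Something like this is what I would try (the sizes are only guesses on
my part):

  <filterCache
    class="solr.LRUCache"
    size="16384"
    initialSize="4096"
    autowarmCount="4096"/>

  <queryResultCache
    class="solr.LRUCache"
    size="16384"
    initialSize="4096"
    autowarmCount="1024"/>

  <!-- warm each new searcher with a typical query so the first
       real requests don't pay the full cost -->
  <listener event="newSearcher" class="solr.QuerySenderListener">
    <arr name="queries">
      <lst><str name="q">text:example</str></lst>
    </arr>
  </listener>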

Thanks a lot for your help,
Johanna

