I have been using JMeter to perform some load testing. In your case you might
like to take a look at the CSV Data Set Config component:
http://jakarta.apache.org/jmeter/usermanual/component_reference.html#CSV_Data_Set_Config
This will let you pull a different item from your query list on each request.
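If you'd rather script it yourself, here is a minimal sketch of a random-query load driver. It assumes Solr is reachable at http://localhost:8983/solr/select and that your 5000 user queries live one-per-line in a file called queries.txt; both are placeholders you'd adjust for your environment, and real benchmarking would add concurrency and timing on top of this.

```python
# Minimal random-query load driver (sketch, not a full benchmark).
# Assumptions: Solr select handler at base_url, queries.txt holds one
# user-entered query per line. Adjust both for your setup.
import random
import urllib.parse
import urllib.request


def build_query_url(base_url, query, rows=10):
    """Build a Solr select URL for a single query string."""
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    return f"{base_url}?{params}"


def run(base_url, queries_file, num_requests=1000):
    """Fire num_requests GETs, each with a randomly chosen query."""
    with open(queries_file) as f:
        queries = [line.strip() for line in f if line.strip()]
    for _ in range(num_requests):
        url = build_query_url(base_url, random.choice(queries))
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body; timing/latency capture goes here


if __name__ == "__main__":
    run("http://localhost:8983/solr/select", "queries.txt")
```

JMeter will give you proper ramp-up, think times, and reporting for free, so treat the above only as a quick smoke-test alternative.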

Regards,
Kallin Nagelberg

-----Original Message-----
From: Blargy [mailto:zman...@hotmail.com] 
Sent: Friday, April 09, 2010 9:47 PM
To: solr-user@lucene.apache.org
Subject: Benchmarking Solr


I am about to deploy Solr into our production environment and I would like to
do some benchmarking to determine how many slaves I will need to set up.
Currently the only way I know how to benchmark is to use Apache Benchmark,
but I would like to be able to send random requests to Solr, not just
one request over and over.

I have a sample data set of 5000 user entered queries and I would like to be
able to use AB to benchmark against all these random queries. Is this
possible?

FYI, our current index is ~1.5 GB with ~5m documents and we will be using
faceting quite extensively. Our average request volume is ~2m per day. We will
be running RHEL with about 8-12 GB of RAM. Any idea how many slaves might be
required to handle our load?

Thanks
-- 
View this message in context: 
http://n3.nabble.com/Benchmarking-Solr-tp709561p709561.html
Sent from the Solr - User mailing list archive at Nabble.com.