I've got a very simple Perl script (most of the work is done by
modules) that forks off multiple processes, throws requests at Solr,
and then does a little bit of statistical analysis at the end. I have
planned on sharing it from the beginning; I just have to clean it up
for public consumption. I will try to do that today, though I don't
know if I can. Does anyone have a recommendation about where to put it
on the wiki?
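
In the meantime, here's a bare-bones sketch of the approach (this is
not the actual script; the worker count, the queries.txt file, and the
Solr URL are just placeholders for whatever your setup uses):

#!/usr/bin/perl
# Rough sketch only -- not the real script.  Assumes queries.txt holds
# one raw query per line and Solr answers at the URL below.
use strict;
use warnings;
use LWP::UserAgent;
use URI::Escape qw(uri_escape);
use Time::HiRes qw(time);

my $workers = 4;
open my $fh, '<', 'queries.txt' or die "queries.txt: $!";
chomp(my @queries = <$fh>);
close $fh;

for my $w (1 .. $workers) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    next if $pid;                       # parent: keep forking

    # child: take every $workers-th query, time each request
    my $ua = LWP::UserAgent->new(timeout => 30);
    for (my $i = $w - 1; $i < @queries; $i += $workers) {
        my $url = 'http://localhost:8983/solr/select?q='
                . uri_escape($queries[$i]);
        my $t0  = time();
        my $res = $ua->get($url);
        printf "%d %.4f %s\n", $w, time() - $t0, $res->status_line;
    }
    exit 0;
}
wait() for 1 .. $workers;               # reap all children

Each child prints its worker number, the elapsed seconds, and the HTTP
status for every request; the real script gathers those numbers up for
the statistics at the end.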
Shawn
On 4/9/2010 7:46 PM, Blargy wrote:
I am about to deploy Solr into our production environment, and I would
like to do some benchmarking to determine how many slaves I will need
to set up. Currently the only way I know how to benchmark is with
ApacheBench (ab), but I would like to be able to send random requests
to Solr, not just one request over and over.
I have a sample data set of 5000 user-entered queries, and I would like
to be able to use ab to benchmark against all of these random queries.
Is this possible?
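
For example, turning the sample set into a shuffled list of full
request URLs is easy enough (the file name and the Solr URL here are
made up); what I don't know is how to get ab to replay such a list:

#!/usr/bin/perl
# Sketch: expand one-query-per-line sample data into shuffled
# Solr request URLs.  File name, host, and path are made up.
use strict;
use warnings;
use URI::Escape qw(uri_escape);
use List::Util qw(shuffle);

open my $fh, '<', 'queries.txt' or die "queries.txt: $!";
chomp(my @queries = <$fh>);
close $fh;

for my $q (shuffle @queries) {
    next unless length $q;
    print 'http://localhost:8983/solr/select?q=', uri_escape($q), "\n";
}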
FYI, our current index is ~1.5 GB with ~5M documents, and we will be
using faceting quite extensively. Our average load is ~2M requests per
day (roughly 23 requests per second on average). We will be running
RHEL with about 8-12 GB of RAM. Any idea how many slaves might be
required to handle our load?
Thanks