Hi,
I did some trivial tests with JMeter.
I set up JMeter to increase the number of threads steadily.
For the requests I either use a random word or a combination of words from a
wordlist, or some sample data from the test system (this is described in the
JMeter manual).
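
Very roughly, the idea is the following (just a minimal Python sketch of the
same load pattern, not the actual JMeter test plan; the Solr URL and the
wordlist filename are assumptions here):

#!/usr/bin/env python3
# Sketch of the test described above: pick one or two random words from a
# wordlist, query Solr, and steadily increase the number of concurrent
# threads like a JMeter ramp-up.  Endpoint and wordlist path are made up.

import random
import time
import urllib.parse
import urllib.request
from concurrent.futures import ThreadPoolExecutor

SOLR_URL = "http://localhost:8983/solr/select"   # assumed Solr endpoint
WORDLIST = "words.txt"                           # assumed wordlist, one word per line

with open(WORDLIST) as fh:
    words = [w.strip() for w in fh if w.strip()]

def one_query():
    """Send one query built from one or two random words, return the latency."""
    q = " ".join(random.sample(words, random.choice((1, 2))))
    url = SOLR_URL + "?" + urllib.parse.urlencode({"q": q, "rows": 10})
    start = time.time()
    urllib.request.urlopen(url).read()
    return time.time() - start

# Steadily increase the number of concurrent threads.
for threads in (1, 2, 5, 10, 20, 50):
    with ThreadPoolExecutor(max_workers=threads) as pool:
        latencies = list(pool.map(lambda _: one_query(), range(threads * 20)))
    print("%3d threads: avg %.3fs  max %.3fs"
          % (threads, sum(latencies) / len(latencies), max(latencies)))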

In my case the system works fine as long as I don't exceed the maximum number
of requests per second it can handle, but that's not a big surprise. More
interesting is that, after exceeding that maximum rate, response time seems to
rise linearly for a little while and then exponentially. But that might also
be an artifact of my test scenario.

Nico


> -----Original Message-----
> From: Jacob Singh [mailto:[EMAIL PROTECTED]
> Sent: Sunday, June 29, 2008 6:04 PM
> To: solr-user@lucene.apache.org
> Subject: Benchmarking tools?
>
> Hi folks,
>
> Does anyone have any bright ideas on how to benchmark solr?
> Unless someone has something better, here is what I am thinking:
>
> 1. Have a config file where one can specify info like how
> many docs, how large, how many facets, and how many updates /
> searches per minute
>
> 2. Use one of the various client APIs to generate XML files
> for updates using some kind of lorem ipsum text as a base and
> store them in a dir.
>
> 3. Use siege to set the update run at whatever interval is
> specified in the config, sending an update every x seconds
> and removing it from the directory
>
> 4. Generate a list of search queries based upon the facets
> created, and build a urls.txt with all of these search urls
>
> 5. Run the searches through siege
>
> 6. Monitor the output using nagios to see where load kicks in.
>
> This is not that sophisticated, and feels like it won't
> really pinpoint bottlenecks, but would approximately tell us
> where a server will start to bail.
>
> Does anyone have any better ideas?
>
> Best,
> Jacob Singh
>
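
For what it's worth, steps 2 and 4 of the plan quoted above could look
roughly like the sketch below (field names, facet values and the Solr URL
are made up for the example; siege would then replay urls.txt with something
like "siege -c 25 -t 2M -f urls.txt"):

#!/usr/bin/env python3
# Rough sketch, not a finished tool: write Solr <add> documents filled with
# lorem-ipsum text into a directory (step 2), and build a urls.txt of facet
# queries for siege (step 4).

import os
import random
from xml.sax.saxutils import escape

LOREM = ("lorem ipsum dolor sit amet consectetur adipiscing elit "
         "sed do eiusmod tempor incididunt ut labore").split()
FACETS = {"category": ["books", "music", "video"],   # assumed facet fields
          "lang": ["en", "de", "fr"]}
SOLR_SELECT = "http://localhost:8983/solr/select"    # assumed endpoint

os.makedirs("updates", exist_ok=True)

# Step 2: one XML update file per document, to be posted later.
for i in range(100):
    body = " ".join(random.choices(LOREM, k=200))
    doc = ('<add><doc>'
           '<field name="id">%d</field>'
           '<field name="category">%s</field>'
           '<field name="lang">%s</field>'
           '<field name="text">%s</field>'
           '</doc></add>' % (i,
                             random.choice(FACETS["category"]),
                             random.choice(FACETS["lang"]),
                             escape(body)))
    with open("updates/doc-%04d.xml" % i, "w") as fh:
        fh.write(doc)

# Step 4: a urls.txt of faceted searches over the same values.
with open("urls.txt", "w") as fh:
    for word in LOREM:
        for field, values in FACETS.items():
            for value in values:
                fh.write("%s?q=%s&fq=%s:%s&facet=true&facet.field=%s\n"
                         % (SOLR_SELECT, word, field, value, field))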

