Hi folks,

Does anyone have any bright ideas on how to benchmark Solr? Unless
someone has something better, here is what I am thinking:

1. Have a config file where one can specify info like how many docs, how
large they are, how many facets, and how many updates / searches per
minute (a rough sketch of this and step 2 follows the list)

2. Use one of the various client APIs to generate XML update files,
using some kind of lorem ipsum text as a base, and store them in a
directory

3. Use siege to run the updates at whatever interval is specified in
the config, sending an update every x seconds and removing each file
from the directory once it has been posted (siege notes below)

4. Generate a list of search queries based upon the facets created, and
build a urls.txt with all of these search URLs (sketch below)

5. Run the searches through siege

6. Monitor the output using Nagios to see where load kicks in (example
check below)
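
To make that concrete, here is a rough sketch of steps 1 and 2 in
Python. The config is inlined as a dict to keep it short, and all the
field names, paths, and numbers are made up, just to show the shape of
the thing:

    #!/usr/bin/env python3
    # Sketch of steps 1 and 2: read the benchmark config, then write
    # one Solr XML update file per document into a spool directory.
    # Field names, paths, and numbers are made up for illustration.

    import os
    import random
    from xml.sax.saxutils import escape

    # Step 1's config -- in a real run this would be parsed from a file.
    CONFIG = {
        "num_docs": 1000,         # how many docs to generate
        "words_per_doc": 200,     # roughly how large each doc is
        "num_facets": 5,          # distinct facet values to spread around
        "updates_per_minute": 60,
        "searches_per_minute": 300,
        "spool_dir": "updates",   # where siege will pick the files up from
    }

    LOREM = ("lorem ipsum dolor sit amet consectetur adipiscing elit sed do "
             "eiusmod tempor incididunt ut labore et dolore magna aliqua").split()

    def make_doc(doc_id):
        """Build one <add> envelope: an id, a facet field, lorem ipsum body."""
        body = " ".join(random.choice(LOREM)
                        for _ in range(CONFIG["words_per_doc"]))
        facet = f"facet_{random.randrange(CONFIG['num_facets'])}"
        return ("<add><doc>"
                f'<field name="id">{doc_id}</field>'
                f'<field name="category">{facet}</field>'
                f'<field name="body">{escape(body)}</field>'
                "</doc></add>")

    os.makedirs(CONFIG["spool_dir"], exist_ok=True)
    for i in range(CONFIG["num_docs"]):
        with open(os.path.join(CONFIG["spool_dir"], f"doc_{i:05d}.xml"), "w") as f:
            f.write(make_doc(i))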
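
For step 3, if I remember the siege urls-file syntax right, you can
POST the contents of a file by redirecting it after the URL, so the
update run could be driven by a second urls file (host, port, and
paths are placeholders; double-check against your siege version):

    # update-urls.txt -- one POST per generated file
    http://localhost:8983/solr/update POST </path/to/updates/doc_00000.xml
    http://localhost:8983/solr/update POST </path/to/updates/doc_00001.xml

    $ siege -f update-urls.txt -c 1 -d 5 -T 'text/xml'

Here -c 1 keeps a single updater, -d 5 sleeps a random interval of up
to five seconds between hits (as far as I know siege only does random
delays, not fixed ones), and -T sets the Content-Type header, which
Solr's /update handler wants as text/xml. Siege won't remove the files
after posting, so that part of step 3 would need a small wrapper
script.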
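
Step 4 could reuse the same made-up field names to spit out faceted
searches in the plain one-URL-per-line format siege reads:

    # Sketch of step 4: write a urls file of faceted searches.  The
    # field and facet names match the made-up ones in the generator.

    import random
    from urllib.parse import urlencode

    BASE = "http://localhost:8983/solr/select"
    TERMS = ["lorem", "ipsum", "dolor", "tempor", "labore"]
    NUM_FACETS = 5
    NUM_QUERIES = 500

    with open("search-urls.txt", "w") as out:
        for _ in range(NUM_QUERIES):
            params = {"q": f"body:{random.choice(TERMS)}",
                      "facet": "true",
                      "facet.field": "category"}
            # have half the queries also drill into a single facet value
            if random.random() < 0.5:
                params["fq"] = f"category:facet_{random.randrange(NUM_FACETS)}"
            out.write(BASE + "?" + urlencode(params) + "\n")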
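
Steps 5 and 6 are then mostly invocation. Something like the following
would replay the generated searches in random order (-i) with ten
concurrent users for ten minutes, while a stock Nagios check_http
watches response time (all the numbers are guesses to tune):

    $ siege -f search-urls.txt -c 10 -i -t 10M

    # Nagios-side service check, run every minute or so during the test;
    # -w/-c are warn/critical thresholds on response time in seconds
    check_http -H localhost -p 8983 -u '/solr/select?q=*:*' -w 1 -c 5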

This is not that sophisticated, and it feels like it won't really
pinpoint bottlenecks, but it would tell us approximately where a server
will start to bail.

Does anyone have any better ideas?

Best,
Jacob Singh
