Hello,
1. It depends on your query types & data (complexity, feature set,
paging) - geospatial, for example, could involve calculation inside Solr?
2. It depends massively on the document size & field selection (loading
a hundred 100MB documents can take some time)
3. It depends especially on your d
Are you requesting all 100K results in one request? If so, that is pretty fast.
If you are doing that, don't do that. Page the results.
wunder
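A minimal sketch of the paging advice above, using Solr's standard start/rows query parameters. The query and page size here are placeholder assumptions, not values from the thread:

```python
# Sketch: page through a large Solr result set with start/rows
# instead of fetching all 100K docs in a single request.
# Collection URL, query, and page size are hypothetical.

def page_params(query, page_size, num_found):
    """Yield one Solr query-parameter dict per page of results."""
    for start in range(0, num_found, page_size):
        yield {"q": query, "start": start, "rows": page_size, "wt": "json"}

pages = list(page_params("*:*", 1000, 100_000))
# 100 requests of 1,000 docs each, rather than one request for 100K
```

Note that very deep start offsets also get expensive in Solr; for truly deep paging, later Solr versions (4.7+) added cursorMark, which avoids the deep-offset cost.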
On Jul 16, 2013, at 9:30 AM, Daniel Collins wrote:
> You only have a 20Gb collection, but is that per machine or the total
> for the collection, i.e. 10Gb per machine?
You only have a 20Gb collection, but is that per machine or the total
for the collection, i.e. 10Gb per machine? What memory do you have
available on those two machines? Is it enough to get the collection into
the disk cache? What OS is it (Linux/Windows, etc.)?
What heap size does your JVM have?
Is it a static co
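For reference, heap size is set with JVM flags when starting Solr. A hypothetical, era-appropriate invocation (the sizes, GC choice, and start.jar layout are assumptions, not details from the thread):

```shell
# Hypothetical Solr 4.x startup with an explicit heap size.
# Keep -Xms/-Xmx modest so most RAM stays free for the OS disk cache,
# which is what actually keeps the index hot.
java -Xms4g -Xmx4g -XX:+UseConcMarkSweepGC -jar start.jar
```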
Have you looked at cache utilization?
Have you checked the IO and CPU load to see what the bottlenecks are?
Are you sure things like your heap and servlet container threads are tuned?
After you look at those issues, I'd probably think about adding http
caching and more replicas.
Michael Della Bit