Hi Upayavira,
         I'm working with Solr 4.0, sorting on score (the default).
I tried setting the documentCache size to 2048, so all the docs of a single
request fit (actually two requests' worth fit).
If I execute a query for the first time, it takes 24s.
If I re-execute it, with all docs now in the documentCache, it takes 15s.
If I execute it with rows=400, it takes 3s.

It seems that below rows=400 the response times are acceptable; beyond that they get slow.
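
For reference, the documentCache section of my solrconfig.xml now looks roughly
like this (the relevant change is size=2048; class, initialSize and
autowarmCount are just the usual defaults):

    <documentCache class="solr.LRUCache"
                   size="2048"
                   initialSize="2048"
                   autowarmCount="0"/>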

2016-02-11 11:27 GMT+01:00 Upayavira <u...@odoko.co.uk>:

>
>
> On Thu, Feb 11, 2016, at 09:33 AM, Matteo Grolla wrote:
> > Hi,
> >      I'm trying to optimize a Solr application.
> > The bottleneck is queries that request 1000 rows from Solr.
> > Unfortunately the application can't be modified at the moment; can you
> > suggest what could be done on the Solr side to improve
> > performance?
> > The bottleneck is just on fetching the results, the query executes very
> > fast.
> > I suggested caching the .fdx and .fdt files in the file system cache.
> > Anything else?
>
> The index files will automatically be cached in the OS disk cache
> without any intervention, so that can't be the issue.
>
> How are you sorting the results? Are you letting it calculate scores?
> 1000 rows shouldn't be particularly expensive, beyond the unavoidable
> network cost.
>
> Have you considered using the /export endpoint and the streaming API? I
> haven't used it myself, but it is intended for getting larger amounts of
> data out of a Solr index.
>
> Upayavira
>
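
P.S. regarding /export: we're on 4.0, so I'm not sure it's available to us,
but if I understand the docs correctly a request against it would look
something like this (collection and field names are just placeholders, and
the sort/fl fields need docValues):

    curl "http://localhost:8983/solr/mycollection/export?q=*:*&sort=id+asc&fl=id,price"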