Hi Georg,
I would say, without knowing your context, that this is not what Solr is
supposed to do. You're asking it to load everything in a single
request/response, and that poses a problem.
Even assuming it worked, you would then have to iterate over those
results one by one or in blocks anyway, so an option is to let Solr do
that part (block scrolling) for you [2].
I suggest you have a look at
* the export endpoint [1]
* the cursor API [2]
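To make the cursor API concrete, here is a minimal sketch of the loop it prescribes: start with cursorMark=*, pass the nextCursorMark from each response into the next request, and stop when the mark you sent comes back unchanged. The `fetch_page` function here is a hypothetical stand-in for a real HTTP call to /select (which would also need a sort clause ending on the uniqueKey field, as the docs require); only the loop logic is the point.

```python
def paginate(fetch_page, rows=100):
    """Collect all documents by following nextCursorMark until it repeats."""
    docs = []
    cursor = "*"  # initial cursor mark mandated by the cursor API
    while True:
        page = fetch_page(cursor=cursor, rows=rows)
        docs.extend(page["docs"])
        next_cursor = page["nextCursorMark"]
        # Solr signals the end by echoing back the cursor mark you sent.
        if next_cursor == cursor:
            break
        cursor = next_cursor
    return docs

# Tiny in-memory stand-in for Solr so the loop can be exercised offline;
# a real implementation would issue an HTTP request per page instead.
def make_fake_solr(all_docs):
    def fetch_page(cursor, rows):
        start = 0 if cursor == "*" else int(cursor)
        page = all_docs[start:start + rows]
        # When nothing is left, return the same cursor to end the loop.
        return {"docs": page,
                "nextCursorMark": str(start + len(page)) if page else cursor}
    return fetch_page
```

With rows kept small and constant, memory stays bounded no matter how many documents match, which is exactly what a huge rows value cannot guarantee.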
Best,
Andrea
[1] https://lucene.apache.org/solr/guide/6_6/exporting-result-sets.html
[2]
https://lucene.apache.org/solr/guide/6_6/pagination-of-results.html#fetching-a-large-number-of-sorted-results-cursors
On 31/07/18 10:44, Georg Fette wrote:
Hello,
We run server version 7.3.1 on a machine with 32GB of RAM, started
with -10g.
When requesting a query with
q={!boost
b=sv_int_catalog_count_document}string_catalog_aliases:(*2*)&fq=string_field_type:catalog_entry&rows=2147483647
the server consumes all available memory up to 10GB and is then no
longer accessible, with one processor at 100%.
When we reduce the rows parameter to 10000000, the query works; it
returns only 581 results.
The documentation at
https://wiki.apache.org/solr/CommonQueryParameters states that a
"ridiculously large value" may be used for the "rows" parameter, but
apparently this can pose a problem. The value we used was Java's
Integer.MAX_VALUE.
Greetings
Georg