On Tue, 2018-07-31 at 11:12 +0200, Fette, Georg wrote:
> I agree that receiving too much data in one request is bad. But I
> was surprised that the query works with a lower but still very large
> rows parameter and that there is a threshold at which it crashes the
> server.
> Furthermore, it seems
Georg,
On 7/31/18 12:33 PM, Georg Fette wrote:
> Yes, it is only one of the processors that is at maximum capacity.
Ok.
> How do I do something like a thread dump of a single thread?
Here's how to get a thread dump of the whole JVM:
https://wiki
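The wiki link above is cut off, but one common way to answer the "single thread" question on Linux is to find the hot thread's native id and then pick its section out of a full `jstack` dump. A sketch under stated assumptions (the process-name match `solr` and the example thread id 12345 are placeholders):

```shell
# Assumptions: a Linux box, the Solr JVM has "solr" on its command line,
# and a JDK providing jstack is on the PATH.
SOLR_PID=$(pgrep -f solr | head -n 1)

if [ -n "$SOLR_PID" ]; then
    # Show per-thread CPU usage; the PID column here is a *thread* id.
    top -H -b -n 1 -p "$SOLR_PID" | head -n 20
fi

# jstack labels each thread with its native id in hex (nid=0x...), so
# convert the decimal thread id reported by top, e.g. 12345:
HEX_TID=$(printf '%x' 12345)

if [ -n "$SOLR_PID" ] && command -v jstack >/dev/null 2>&1; then
    # Dump all threads, then keep only the section for the hot one.
    jstack -l "$SOLR_PID" | grep -A 20 "nid=0x$HEX_TID" || true
fi
```

This works against a stock command-line Solr install; no development environment is needed, only a JDK on the machine.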
Hi Christoph,
Yes, it is only one of the processors that is at maximum capacity.
How do I do something like a thread dump of a single thread? We run the
Solr from the command line out-of-the-box and not in a code development
environment. Are there parameters that can be configured so that the
ser
On 7/31/2018 2:39 AM, Georg Fette wrote:
We run the server version 7.3.1 on a machine with 32GB RAM in a mode
having -10g.
When requesting a query with
q={!boost
b=sv_int_catalog_count_document}string_catalog_aliases:(*2*)&fq=string_field_type:catalog_entry&rows=2147483647
the server takes
Georg,
On 7/31/18 4:39 AM, Georg Fette wrote:
> We run the server version 7.3.1 on a machine with 32GB RAM in a
> mode having -10g.
>
> When requesting a query with
>
> q={!boost
> b=sv_int_catalog_count_document}string_catalog_aliases:(*2*)&fq=
Yes, but 581 is the final number you got in the response, which is the
result of the main query intersected with the filter query, so I wouldn't
take this number into account. The main query and the filter query are
executed separately, so I guess (but I'm only guessing, since I don't know
these internals)
Hi Andrea,
I agree that receiving too much data in one request is bad. But I was
surprised that the query works with a lower but still very large rows
parameter and that there is a threshold at which it crashes the server.
Furthermore, it seems that the reason for the crash is not the size of
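The threshold behaviour described here matches how, as far as I understand it, Lucene pre-allocates its top-docs priority queue from the `rows` parameter before collecting a single result. A back-of-the-envelope sketch (the ~28 bytes per queue slot is an assumption for illustration, not a measured figure):

```python
# Assumed cost per pre-allocated hit-queue slot (object header, doc id,
# score, padding). This is an illustrative guess, not a measured value.
BYTES_PER_SLOT = 28

def queue_bytes(rows):
    """Approximate memory reserved up front for the top-docs queue."""
    return rows * BYTES_PER_SLOT

huge = queue_bytes(2_147_483_647)   # rows = Integer.MAX_VALUE
print(huge / 2**30)                 # roughly 56 GiB -> far beyond a 10 GB heap

large = queue_bytes(10_000_000)     # a "large but lower" rows value
print(large / 2**30)                # well under 1 GiB -> the query still works
```

This would explain why the crash depends on `rows` itself rather than on the 581 documents actually matched: the allocation happens before any matching.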
Hi Georg,
I would say, without knowing your context, that this is not what Solr is
supposed to do. You're asking to load everything in a single
request/response and this poses a problem.
Since I guess that, even if we assume it works, you should then iterate
those results one by one or in blocks,
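Iterating in blocks is what Solr's cursorMark deep paging (available since Solr 4.7; it requires a sort clause that includes the uniqueKey field) is for. A minimal sketch of the loop, with the HTTP call abstracted behind a `fetch` callable so nothing here depends on a live server:

```python
def iterate_all(fetch):
    """Yield every matching document via cursorMark deep paging.

    fetch(cursor) must return the parsed JSON of a Solr /select request
    made with cursorMark=<cursor> (plus q, rows, and a sort on the
    uniqueKey field). The stop condition is the important part: Solr
    signals the end by echoing back the cursor you sent.
    """
    cursor = "*"  # "*" starts a new cursor
    while True:
        response = fetch(cursor)
        for doc in response["response"]["docs"]:
            yield doc
        next_cursor = response["nextCursorMark"]
        if next_cursor == cursor:  # unchanged cursor -> no more results
            return
        cursor = next_cursor
```

With `requests`, `fetch(cursor)` would GET `/solr/<collection>/select` with params like `q`, `fq`, `rows=500`, `sort=id asc`, and `cursorMark=cursor` (collection and field names here are assumptions).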
solr-user@lucene.apache.org
> Subject: Solr Server crashes when requesting a result with too large
> resultRows
>
> Hello,
> We run the server version 7.3.1 on a machine with 32GB RAM in a mode
> having -10g.
> When requesting a query with
> q={!boost
> b=sv_int_catalog_c
Hello,
We run the server version 7.3.1 on a machine with 32GB RAM in a mode
having -10g.
When requesting a query with
q={!boost
b=sv_int_catalog_count_document}string_catalog_aliases:(*2*)&fq=string_field_type:catalog_entry&rows=2147483647
the server takes all available memory up to 10GB and i
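The `rows=2147483647` in the query is `Integer.MAX_VALUE`. A safer pattern than asking for everything at once is to send `rows=0` first (Solr then returns only `response.numFound`, loading no documents) and follow up with bounded pages. A sketch that just builds the two request URLs (host, port, and the collection name `catalog` are assumptions):

```python
from urllib.parse import urlencode

BASE = "http://localhost:8983/solr/catalog/select"  # assumed host/collection

def count_url(q, fq):
    # rows=0: only the total hit count (response.numFound) comes back
    return BASE + "?" + urlencode({"q": q, "fq": fq, "rows": 0})

def page_url(q, fq, start, rows=1000):
    # A bounded page; for deep result sets, cursorMark scales better
    # than large start offsets.
    return BASE + "?" + urlencode({"q": q, "fq": fq, "start": start, "rows": rows})

print(count_url("{!boost b=sv_int_catalog_count_document}string_catalog_aliases:(*2*)",
                "string_field_type:catalog_entry"))
```

Either way, the heap never has to hold room for two billion results at once.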