On 5/5/2016 11:42 PM, Bastien Latard - MDPI AG wrote:
> So if I run the following two requests, it will only store the 7.5 MB once,
> right?
> - select?q=*:*&fq=bPublic:true&rows=10
> - select?q=field:my_search&fq=bPublic:true&rows=10
That is correct.
Thanks,
Shawn
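
(A note for readers of the archive: a rough sketch of where a figure like 7.5 MB comes from, assuming the filterCache keeps one bitset with a bit per document in the index for each unique fq — the 60 million document count is just the number quoted elsewhere in this thread, not a measurement.)

// Illustrative arithmetic only: a filterCache entry is roughly a bitset with
// one bit per document in the index (maxDoc), independent of q and rows.
public class FilterCacheEntrySize {
    public static void main(String[] args) {
        long maxDoc = 60_000_000L;          // index size quoted elsewhere in this thread
        long bytesPerEntry = maxDoc / 8;    // one bit per document
        System.out.printf("~%.1f MB per unique fq%n", bytesPerEntry / 1_000_000.0);
        // Prints ~7.5 MB. Both q=*:*&fq=bPublic:true and q=field:my_search&fq=bPublic:true
        // share this single entry, because the cache key is the fq, not the main query.
    }
}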
Thank you, Shawn!
So if I run the following two requests, it will only store the 7.5 MB once,
right?
- select?q=*:*&fq=bPublic:true&rows=10
- select?q=field:my_search&fq=bPublic:true&rows=10
kr,
Bast
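
(Another note for the archive: the same two requests expressed with SolrJ, purely as a sketch — the base URL and collection name are placeholders, and the builder-style client shown is the Solr 6.x-era SolrJ API.)

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class SameFilterQueryTwice {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/collection; adjust to your install.
        try (HttpSolrClient client =
                 new HttpSolrClient.Builder("http://localhost:8983/solr/my_collection").build()) {

            // select?q=*:*&fq=bPublic:true&rows=10
            SolrQuery first = new SolrQuery("*:*");
            first.addFilterQuery("bPublic:true");
            first.setRows(10);
            QueryResponse r1 = client.query(first);

            // select?q=field:my_search&fq=bPublic:true&rows=10
            SolrQuery second = new SolrQuery("field:my_search");
            second.addFilterQuery("bPublic:true");
            second.setRows(10);
            QueryResponse r2 = client.query(second);

            // Different q, same fq: the second request can reuse the cached
            // bPublic:true bitset built for the first one.
            System.out.println(r1.getResults().getNumFound() + " / " + r2.getResults().getNumFound());
        }
    }
}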
Hi Tomás,
Thank you for your email.
You said "have big caches or request big pages (e.g. 100k docs)"...
Does a fq cache all the potential results, or only the ones the query
returns?
e.g.: select?q=*:*&fq=bPublic:true&rows=10
=> with this query, if I have 60 million public documents, would it cache
all of them, or only the 10 rows returned?
You could use a memory analyzer tool (e.g. jmap); that could give you a
hint. But if you are migrating, I'd start by checking whether you changed
something from the previous version, including JVM settings, schema/solrconfig.
If nothing is different, I'd try to identify which feature is consuming
more memory, for example if you have big caches or request big pages (e.g. 100k docs).
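
(Archive note: besides jmap, the standard java.lang.management API gives a quick read of heap occupancy per memory pool — a sketch only, not specific to Solr, and not a substitute for a real heap dump when you need to know which objects are using the memory.)

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

public class HeapPoolUsage {
    public static void main(String[] args) {
        // Overall heap: used vs. max.
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.printf("heap used %d MB of max %d MB%n",
                heap.getUsed() / (1024 * 1024), heap.getMax() / (1024 * 1024));

        // Per-pool breakdown (eden, survivor, old gen, ...).
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            MemoryUsage usage = pool.getUsage();
            if (usage != null) {
                System.out.printf("%-30s used %d MB%n", pool.getName(), usage.getUsed() / (1024 * 1024));
            }
        }
    }
}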
Hi Tomás,
Thanks for your answer.
How could I see what's using memory?
I tried to add "-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/var/solr/logs/OOM_Heap_dump/"
...but this doesn't seem to be very helpful...
Kind regards,
Bastien
On 02/05/2016 22:55, Tomás Fernández Löbbe wrote:
You could, but before that I'd try to see what's using your memory and see
if you can decrease that. Maybe identify why you are running OOM now and
not with your previous Solr version (assuming you weren't, and that you are
running with the same JVM settings). A bigger heap usually means more work
for the garbage collector.
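
(Last archive note: on "running with the same JVM settings" — a small sketch of how one might dump the effective settings of a running JVM to compare two installations; the java.lang.management calls are standard, nothing here is specific to Solr.)

import java.lang.management.ManagementFactory;

public class PrintJvmSettings {
    public static void main(String[] args) {
        // JVM arguments the process was actually started with (-Xmx, GC flags, ...).
        for (String arg : ManagementFactory.getRuntimeMXBean().getInputArguments()) {
            System.out.println(arg);
        }
        // Effective heap ceiling, whatever -Xmx resolved to.
        System.out.printf("maxMemory: %d MB%n", Runtime.getRuntime().maxMemory() / (1024 * 1024));
    }
}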