If a SOLR or Zookeeper host is replaced such that the hostname comes back but
it now resolves to a different IP, are SOLR nodes and SOLRJ clients expected
to just continue working?
I know that in this case replicas need to be re-created (and stale ones
deleted). But I'm wondering if SOLRJ clien
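For context, this is the kind of SOLRJ setup in question: a minimal sketch, assuming a CloudSolrClient pointed at the ZooKeeper ensemble by hostname (the host names, port, and collection are placeholders, not details from the original cluster):

    import java.util.Arrays;
    import java.util.Optional;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;

    public class ClientSetupSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical ZooKeeper hostnames; the question is whether this
            // client keeps working after one of them comes back on a new IP.
            CloudSolrClient client = new CloudSolrClient.Builder(
                    Arrays.asList("zk1:2181", "zk2:2181", "zk3:2181"),
                    Optional.empty())          // no ZK chroot
                .build();
            client.setDefaultCollection("mycollection");
            System.out.println(client.ping().getStatus());  // simple liveness check
            client.close();
        }
    }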
3-node SOLR 7.4.0
24gb max heap memory
13 collections, each with 500mb-2gb index (on disk)
We are investigating high heap memory usage/spikes with our SOLR cluster
(details above). After rebooting the cluster, all three instances stay
under 2gb for about a day. Then suddenly, one instance (srch0
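Not something from the original report, but one way to sample per-node heap numbers while chasing this kind of spike is Solr's Metrics API; a rough SOLRJ sketch, assuming a hypothetical node URL:

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrRequest;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.request.GenericSolrRequest;
    import org.apache.solr.common.params.ModifiableSolrParams;
    import org.apache.solr.common.util.NamedList;

    public class HeapSampler {
        public static void main(String[] args) throws Exception {
            // Hypothetical node URL; point this at each instance in turn.
            try (SolrClient client =
                     new HttpSolrClient.Builder("http://solr-node-1:8983/solr").build()) {
                ModifiableSolrParams params = new ModifiableSolrParams();
                params.set("group", "jvm");            // JVM metrics registry
                params.set("prefix", "memory.heap");   // used/committed/max heap gauges
                NamedList<Object> response = client.request(
                    new GenericSolrRequest(SolrRequest.METHOD.GET, "/admin/metrics", params));
                System.out.println(response.get("metrics"));
            }
        }
    }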
Opened SOLR-13274
We are seeing the same issue running 7.4.0. Increasing the request and
response header size did not resolve the issue. Should we open a JIRA
ticket if one does not already exist?
Thanks, Shawn.
We made a change to add q.op=AND as a separate param and found a few issues.
For example, we have a query that filters out guest users in our product.
It boils down to:
select?q=myname*&q.op=AND&fq=(-(site_role:"Guest"))
debugQuery shows this is parsed as the following, which do
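As an aside, the same request shape can be built with SOLRJ; a minimal sketch (only the query text, q.op, and the site_role filter come from the example above, the rest is hypothetical):

    import org.apache.solr.client.solrj.SolrQuery;

    public class GuestFilterQuery {
        // Builds the same request as the /select URL above.
        static SolrQuery build() {
            SolrQuery query = new SolrQuery("myname*");
            query.set("q.op", "AND");                          // default operator as a separate param
            query.addFilterQuery("(-(site_role:\"Guest\"))");  // exclude guest users
            return query;
        }
    }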
Thanks Shawn!
Based on what you said, is my query supposed to work as is if I set
luceneMatchVersion=7.1.0? It does not appear to.
Also, my understanding is that using the local param makes the AND apply only
to the search terms that follow it in the "q" query string. If I add
q.op=AND as a se
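For concreteness, a sketch of the two forms being compared here (the query text is a placeholder):

    import org.apache.solr.client.solrj.SolrQuery;

    public class QOpForms {
        // Form 1: q.op embedded as a local param at the start of q.
        static SolrQuery localParamForm() {
            return new SolrQuery("{!q.op=AND}mysearchtext*");
        }

        // Form 2: q.op sent as a separate request parameter.
        static SolrQuery separateParamForm() {
            SolrQuery query = new SolrQuery("mysearchtext*");
            query.set("q.op", "AND");
            return query;
        }
    }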
SOLR 7.4.0
Apologies if this has been answered, but I can't find the answer for the
life of me if it has been.
Query:
/select?q={!q.op=AND}mysearchtext*&defType=edismax&debugQuery=true
Result:
"querystring": "{!q.op=AND}mysearchtext*",
"parsedquery": "+DisjunctionMaxQuery(((Synonym(text:q
text:q
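The debug output above suggests the {!q.op=AND} prefix is being treated as literal query text rather than as local params. A sketch of passing both defType and q.op as plain request parameters instead of embedding them in q, which matches the "q.op=AND as a separate param" approach mentioned above:

    import org.apache.solr.client.solrj.SolrQuery;

    public class EdismaxQOp {
        // Same search, with the parser and default operator supplied as
        // separate request parameters rather than inside the q string.
        static SolrQuery build() {
            SolrQuery query = new SolrQuery("mysearchtext*");
            query.set("defType", "edismax");
            query.set("q.op", "AND");
            query.set("debugQuery", "true");  // keep debug output for comparison
            return query;
        }
    }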
SOLR 7.4.0
On Windows, solr.cmd creates solr-*.port files in SOLR_TIP\bin. It would be
nice if the location of these files were configurable (SOLR_PID_DIR kind of
makes sense?).
Would this be worth a lower-priority bug?
SOLR 7.4.0
We recently encountered an issue on Windows where the cleanup/uninstall
processes for our product were having difficulties deleting "hsperf"
directories created in SOLR's data directory. Somehow, these directories
ended up with permissions preventing anyone from deleting them. Further
i
SOLR 7.4.0
In the Linux solr script, LOG4J_PROPS is expected to be the path to a custom
log4j2.xml file. LOG4J_CONFIG ends up being a collection of all
-Dlog4j.configurationFile=BLAH parameter strings.
In the Windows solr.cmd script, LOG4J_PROPS is not respected at all. And
LOG4J_CONFIG is expe
Sorry, yes 10,000 ms.
We have a single test cluster (out of probably hundreds) where one node hits
this consistently. I'm not sure what kind of issues (network?) that node is
having.
Generally though, we ship SOLR as part of our product, and we cannot control
our customers' hardware and setup be