Hi,
We are currently implementing SolrCloud, and as part of this effort we are
investigating which failure modes may happen between Solr and ZooKeeper.
We have found quite a lot of articles describing the "happy path" failure, when ZK
stops (loses majority) and the Solr cluster ceases to serve writes
Hi,
We are currently using Solr 4.2.1 version in our project and everything is
going well. But recently, we are facing an issue with Solr Data Import. It is
not importing files larger than 32766 bytes (i.e., 32 KB) and is
showing 2 exceptions:
1. java.lang.IllegalArgumentException
First I want to thank you for your comments.
Second I'll add some background information.
Here Solr is part of a complex information management project, which I
developed for a customer and which includes different source
databases, containing edited/imported/crawled content.
This project run
In this case create a VPN and then access it.
> Am 02.01.2019 um 11:03 schrieb s...@cid.is:
>
> First I want to thank you for your comments.
> Second I'll add some background information.
>
> Here Solr is part of a complex information management project, which I
> developed for a customer and w
Hi Antony,
I don't know a ton about DIH, so I can't answer your question myself.
But you might have better luck getting an answer from others if you
include more information about the behavior you're curious about.
Where do you see this Last Modified timestamp (in the Solr admin UI?
on your filesy
On 01/01/2019 23:03, Lavanya Thirumalaisami wrote:
Hi,
I am trying to debug a query to find out why one document gets a higher score than
the other. Below are two similar products.
You might take a look at OSC's Splainer http://splainer.io/ or some of
the other tools I've written about recen
Hi,
I don't know the limits about Solr 4.2.1 but the RefGuide of Solr 6.6
says about Field Types for Class StrField:
"String (UTF-8 encoded string or Unicode). Strings are intended for
small fields and are not tokenized or analyzed in any way.
They have a hard limit of slightly less than 32K."
If
On (2) these are BM25 parameters. There are several articles that discuss
BM25 in depth
https://opensourceconnections.com/blog/2015/10/16/bm25-the-next-generation-of-lucene-relevation/
https://www.elastic.co/blog/practical-bm25-part-2-the-bm25-algorithm-and-its-variables
On Tue, Jan 1, 2019
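For intuition about what k1 and b actually do, here is a toy sketch of the per-term BM25 score in Python. It is not Lucene's exact implementation (Lucene differs in details such as length encoding), but the roles of the parameters match: k1 controls term-frequency saturation and b controls document-length normalization.

```python
import math

def bm25_term_score(tf, df, num_docs, doc_len, avg_doc_len, k1=1.2, b=0.75):
    """Toy per-term BM25 score.

    tf: term frequency in the document
    df: number of documents containing the term
    num_docs: total documents in the index
    doc_len / avg_doc_len: field length and average field length
    k1: term-frequency saturation; b: length normalization
    """
    # Lucene-style non-negative IDF
    idf = math.log(1 + (num_docs - df + 0.5) / (df + 0.5))
    # Length-normalized denominator term
    norm = k1 * (1 - b + b * doc_len / avg_doc_len)
    return idf * (tf * (k1 + 1)) / (tf + norm)
```

Playing with b=0 shows document length dropping out of the score entirely, which is often the first thing to check when two "similar" documents score differently.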
Hi Shawn,
Answers to your questions.
1. Yes, we are aware of fault tolerance in our architecture, but it's our dev
env, so we are working with SolrCloud mode on limited machines.
2. Solr is running as a separate app; it's not on WebLogic. We are using
WebLogic for REST services, which further connect to
I typically resolve this sort of situation with an ssh proxy such as
ssh -f user@123.456.789.012 -L <local-port>:127.0.0.1:8983 -N
Then I can access the Solr GUI from localhost:<local-port> on my machine, and all
the traffic is secured by SSH. Pick your local port (<local-port> here) as desired
of course. Sometimes I ha
In case of multiple "jumps", might I suggest the -J switch which allows you to
specify a jump host.
Kay
> On Jan 2, 2019, at 9:37 AM, Gus Heck wrote:
>
> I typically resolve this sort of situation with a ssh proxy such as
>
> ssh -f user@123.456.789.012 -L :127.0.0.1:8983 -N
>
> Then I
Adding to what Bernd said, _string_ fields that large are almost always
a result of misunderstanding the use case. Especially if you
find yourself searching with the q=field:*word* pattern.
If you're trying to search within the string you need a
TextField-based type, not a StrField.
Best,
Erick
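To make that concrete, here is a hypothetical schema fragment (field and type names are placeholders). A StrField indexes the whole value as a single term, which is where the ~32K limit bites; a TextField runs an analyzer, so searches match individual tokens without wildcards.

```xml
<!-- Hypothetical schema fragment; names are placeholders.
     StrField keeps the whole value as one term (hard ~32K limit);
     TextField tokenizes, so word-level search works without q=field:*word*. -->
<fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
<field name="body" type="text_general" indexed="true" stored="true"/>
```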
1> no. At one point, this could be done in the sense that the
collections would be reconstructed, (legacyCloud) but that turned out
to have... side effects. Even in that case, though, Solr couldn't
reconstruct the configsets. (insert rant that you really must store
your configsets in a VCS system so
Please follow the instructions here:
http://lucene.apache.org/solr/community.html#mailing-lists-irc. You
must use the _exact_ same e-mail as you used to subscribe.
If the initial try doesn't work and following the suggestions at the
"problems" link doesn't work for you, let us know. But note you n
If the keys line up nicely across jumps...
On Wed, Jan 2, 2019, 10:49 AM Kay Wrobel wrote:
> In case of multiple "jumps", might I suggest the -J switch which allows
> you to specify a jump host.
>
> Kay
>
> > On Jan 2, 2019, at 9:37 AM, Gus Heck wrote:
> >
> > I typically resolve this sort of situation w
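Combining the two suggestions in this thread, a sketch of the tunnel through a jump host might look like this (host names, user names, and ports are placeholders for your environment; this obviously needs real hosts to run):

```shell
# Forward a local port to Solr on a target host, hopping through a jump host.
# jumphost, target, user, and 8983 are placeholders.
ssh -f -N -J user@jumphost -L 8983:127.0.0.1:8983 user@target
# Then browse http://localhost:8983/solr/ locally; all traffic rides the SSH tunnel.
```

The -J hop works transparently with -L, so the local forward behaves exactly as in the single-host case, provided the keys line up across the jumps as Gus notes.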
On Wed, Jan 2, 2019, 12:25 PM Erick Erickson 1> no. At one point, this could be done in the sense that the
> collections would be reconstructed, (legacyCloud) bu
Is there any way I can debug the parser? Especially the edismax parser, which
does *not* raise any exception but produces an empty parsedQuery. If anyone
can help, please do. I feel very lost and without guidance, and Google searches
have not provided me with any help at all.
> On Dec 28, 2018, at 9:
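One way to inspect what edismax produces without attaching a debugger is debugQuery=true, which returns the parsed query in the response. A sketch (host, collection, and qf field are placeholders taken from the log excerpt later in the thread; this needs a running Solr instance):

```shell
# Ask Solr to include parser debug output alongside the results.
# localhost:8983, collection1, and tm_field_product are placeholders.
curl 'http://localhost:8983/solr/collection1/select?q=ac6023*&defType=edismax&qf=tm_field_product&debugQuery=true&wt=json'
# Inspect debug.parsedquery and debug.parsedquery_toString in the JSON response
# to see exactly what edismax built from the input.
```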
If you mean attach a debugger, Solr is just like any other Java program.
Pass in the standard Java options at startup to have it listen or connect
as usual. The port is just a TCP port, so ssh-tunneling the debugger port
can bridge the gap to a remote machine (or a VPN).
That said the prior thre
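As a sketch of those "standard Java options" for Solr specifically (the port is an arbitrary placeholder, and the -a flag for passing extra JVM options is assumed from the bin/solr start script):

```shell
# Start Solr with the JDWP debug agent listening; 18983 is a placeholder port.
bin/solr start -a "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=18983"
# From your workstation, tunnel the debugger port over SSH, then attach
# your IDE's remote debugger to localhost:18983.
ssh -f -N -L 18983:127.0.0.1:18983 user@solrhost
```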
Well, I was putting that info out there because I am literally hunting down
this issue without any guidance. The real problem for me still is that the Edismax
Query Parser behaves abnormally from version 5 up to the current release, giving me
an empty parsedQuery. Forcing the request through the Lucene pa
On 12/28/2018 8:57 AM, Kay Wrobel wrote:
Here are my log entries:
SOLR 7.x (non-working)
2018-12-28 15:36:32.786 INFO (qtp1769193365-20) [ x:collection1] o.a.s.c.S.Request [collection1]
webapp=/solr path=/select
params={q=ac6023*&qf=tm_field_product^21.0&qf=tm_title_field^8.0&EchoParams=al
Thanks for your thoughts, Shawn. Are you a developer on SOLR?
Anyway, the configuration (solrconfig.xml) was provided by search_api_solr
(Drupal 7 module) and is untouched. You can find it here:
https://cgit.drupalcode.org/search_api_solr/tree/solr-conf/7.x/solrconfig.xml?h=7.x-1.x
Thank you for
Not to mention the handleError() and onSuccess() callbacks.
On Wed, Jan 2, 2019 at 6:14 AM deniz wrote:
> thanks a lot for the explanation :)
>
>
>
> -
> Zeki ama calismiyor... Calissa yapar...
> --
> Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
>
--
Sincerely y
We are still having serious problems with our solrcloud failing due to this
problem.
The problem is clearly data related.
How can I determine which documents are being searched? Is it possible to get
Solr/Lucene to output the docids being searched?
I believe that this is a Lucene bug, but I need
Are you able to re-index a subset into a new collection?
For control of timeouts I would suggest Postman or curl, or some other
non-browser client.
On Wed, Jan 2, 2019 at 2:55 PM Webster Homer <webster.ho...@milliporesigma.com> wrote:
> We are still having serious problems with our solrcloud fa
Right, don't quite know what I was thinking about. Even so, if
ZooKeeper is gone you'd still have to rebuild the .system collection
too. Or at least figure out how to access it again.
On Wed, Jan 2, 2019 at 10:21 AM Gus Heck wrote:
>
> I thought jar files for custom code were meant to go into the