Jason,
Not yet. This issue was on the back burner for a few days. However, we
still need to figure out what could be a potential solution to it.
The setup is a basic one - one node / no shards or replicas,
2 cores.
When I run the query adding debug=timing to the raw query parameters, it just
hangs in
Hey Abhijit,
The information you provided isn't really enough for anyone else on
the mailing list to debug the problem. If you'd like help, please
provide some more information.
Good places to start would be: what is the query, what does Solr tell
you when you add a "debug=timing" parameter to y
Hello,
I am seeing a performance issue with querying on one of the SOLR servers -
instance version 5.4.1.
The total number of documents indexed is 20K plus.
The data returned for this particular query is as few as 22 documents,
yet it takes almost 2 minutes to get the results back.
Is there a way
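For anyone following along: the debug=timing parameter mentioned in the reply above can
also be set from SolrJ. A minimal sketch, assuming a hypothetical core name and query
(not the poster's actual setup):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;

    public class DebugTimingQuery {
      public static void main(String[] args) throws Exception {
        // Hypothetical core name; point this at the slow core.
        HttpSolrClient client = new HttpSolrClient("http://localhost:8983/solr/mycore");
        SolrQuery q = new SolrQuery("field:value");  // hypothetical query
        q.set("debug", "timing");                    // ask Solr for per-component timings
        QueryResponse rsp = client.query(q);
        System.out.println(rsp.getDebugMap());       // the "timing" section shows where the time goes
        client.close();
      }
    }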
On 5/11/2019 2:06 PM, Abhijit Pawar wrote:
"ISODate("2019-03-12T21:53:16.841Z”)” saves the date in mongoDB as*
2019-05-09 21:53:16.841Z* which is passed to SOLR while indexing.
It then throws below error:
*java.text.ParseException: Unparseable date: "Tue Mar 12 21:53:16 UTC 2019"*
If that e
ow to get it to
> spit out the date in the correct format, but that’s where you need to look.
> >
> > Best,
> > Erick
> >
> >> On May 10, 2019, at 2:13 PM, Abhijit Pawar wrote:
> >>
> >> Hello,
> >>
> >> I am trying to index date in
Best,
> Erick
>
>> On May 10, 2019, at 2:13 PM, Abhijit Pawar wrote:
>>
>> Hello,
>>
>> I am trying to index a date in ISODate format, saved like this in a mongoDB
>> collection, using DataImportHandler in SOLR 5.4.1:
>> {
>> .
> Hello,
>
> I am trying to index a date in ISODate format, saved like this in a mongoDB
> collection, using DataImportHandler in SOLR 5.4.1:
> {
> .
> .
> "endDate" : ISODate("2019-03-12T21:53:16.841Z")
> }
>
> It throws below error:
> *java.text.P
Hello,
I am trying to index a date in ISODate format, saved like this in a mongoDB
collection, using DataImportHandler in SOLR 5.4.1:
{
.
.
"endDate" : ISODate("2019-03-12T21:53:16.841Z")
}
It throws below error:
*java.text.ParseException: Unparseable date: "Tue Mar 12 21:5
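The string in that exception is the default java.util.Date.toString() output, so one
workaround (an illustrative sketch only, not taken from the thread) is to re-parse it
and hand Solr an ISO-8601 string instead; within DIH, a DateFormatTransformer with a
matching dateTimeFormat can do the same conversion inside data-config.xml:

    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.Locale;
    import java.util.TimeZone;

    public class ToSolrDate {
      public static void main(String[] args) throws Exception {
        // Pattern for java.util.Date.toString() output, e.g. "Tue Mar 12 21:53:16 UTC 2019".
        SimpleDateFormat in = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy", Locale.ENGLISH);
        // ISO-8601 form that Solr date fields accept.
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'", Locale.ENGLISH);
        out.setTimeZone(TimeZone.getTimeZone("UTC"));

        Date d = in.parse("Tue Mar 12 21:53:16 UTC 2019");
        System.out.println(out.format(d));  // prints 2019-03-12T21:53:16.000Z
      }
    }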
have.
>
> Cheers
>
> Felix
>
>
>
> 2018-04-18 17:11 GMT+02:00 Emir Arnautović :
>
> > Hi Felix,
> > Did you try to do a thread dump while doing an update? Did it show anything?
> >
> > Emir
> > --
> > Monitoring - Log Management - Alerting - A
port Training - http://sematext.com/
>
>
>
> > On 18 Apr 2018, at 17:06, Felix XY wrote:
> >
> > Hello group,
> >
> > for two days now we have been having huge problems with our solr 5.4.1 installation.
> >
> > ( yes, we have to update it. But this will not be a solut
p,
>
> for two days now we have been having huge problems with our solr 5.4.1 installation.
>
> ( yes, we have to update it. But this will not be a solution right now )
>
> All path=/select requests are still very fast. But all /update requests
> take from >30 sec up to 3 minutes.
>
Hello group,
For two days now we have been having huge problems with our solr 5.4.1 installation.
( yes, we have to update it. But this will not be a solution right now )
All path=/select requests are still very fast. But all /update requests
take from >30 sec up to 3 minutes.
The index is not very big (1.000.
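Regarding the thread-dump suggestion upthread: jstack <solr-pid> against the running
Solr process is the usual way to see what the update threads are blocked on while an
/update request is slow. Purely as an illustration of the same idea (this dumps the
current JVM's own threads, not Solr's):

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;

    public class DumpThreads {
      public static void main(String[] args) {
        // Print a stack trace for every live thread in this JVM; run jstack
        // against the Solr process itself to get the equivalent view of its threads.
        for (ThreadInfo info : ManagementFactory.getThreadMXBean().dumpAllThreads(true, true)) {
          System.out.print(info);
        }
      }
    }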
OK, just restarting all the solr nodes did fix it. Since they are in production
I was hesitant to do that.
From: Petersen, Robert (Contr)
Sent: Monday, January 8, 2018 12:34:28 PM
To: solr-user@lucene.apache.org
Subject: solr 5.4.1 leader issue
Hi got two out of
st restart the solr
instances, the zookeeper instances, both, or is there another better way
without restarting everything?
Thx
Robi
From: Petersen, Robert (Contr)
Sent: Monday, January 8, 2018 12:34:28 PM
To: solr-user@lucene.apache.org
Subject: solr 5.4.1 le
I'm on zookeeper 3.4.8
From: Petersen, Robert (Contr)
Sent: Monday, January 8, 2018 12:34:28 PM
To: solr-user@lucene.apache.org
Subject: solr 5.4.1 leader issue
Hi, I've got two out of my three servers thinking they are replicas on one shard and getting
exceptions; wond
Hi, I've got two out of my three servers thinking they are replicas on one shard and getting
exceptions. Wondering what is the easiest way to fix this? Can I just restart
zookeeper across the servers? Here are the exceptions:
TY
Robi
ERROR  null  RecoveryStrategy  Error while trying to recover. core=custsea
> >> > at org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:383)
> >> > at org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:377)
> >> > at org.apache.s
>> >
>> > Please help me; it's very urgent. I need to build a custom tokenizer like
>> > StandardTokenizerFactory where I will write my own rules for indexing.
> > On Wed, Nov 8, 2017 at 4:30 AM, Erick Erickson
> > wrote:
> >
> >> Looks to me like you're compiling against the jars from one version of
> >> Solr and executing against another.
> >>
> >> /root/solr-5.2.1/server/solr/#/conf/m
>
> On Wed, Nov 8, 2017 at 4:30 AM, Erick Erickson
> wrote:
>
>> Looks to me like you're compiling against the jars from one version of
>> Solr and executing against another.
>>
>> /root/solr-5.2.1/server/solr/#####/conf/managed-schema
>>
>
ing against another.
>
> /root/solr-5.2.1/server/solr/#/conf/managed-schema
>
> yet you claim to be using 5.4.1
>
> On Tue, Nov 7, 2017 at 12:00 PM, kumar gaurav wrote:
> > Hi
> >
> > I am developing my own custom filter in solr 5.4.1.
> >
>
custom filter in solr 5.4.1.
>
> I have created a jar of a filter class that extends the TokenizerFactory
> class.
>
> When I loaded it into the solr config and added my filter to managed-schema, I
> found the following error -
>
> org.apache.solr.common.SolrException: Could not load con
Hi
I am developing my own custom filter in solr 5.4.1.
I have created a jar of a filter class that extends the TokenizerFactory
class.
When I loaded it into the solr config and added my filter to managed-schema, I
found the following error -
org.apache.solr.common.SolrException: Could not load conf for
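For reference, a minimal sketch of what such a TokenizerFactory can look like (the
package and class names here are made up, and the body just delegates to
StandardTokenizer - replace that with your own rules). As Erick notes above, it has to
be compiled against the same Lucene/Solr 5.4.1 jars you actually run; the jar then needs
to be loadable via a <lib> directive in solrconfig.xml or the core's lib/ directory, and
the factory class is referenced from the <tokenizer> element of your fieldType in
managed-schema:

    package com.example.analysis;  // hypothetical package

    import java.util.Map;
    import org.apache.lucene.analysis.Tokenizer;
    import org.apache.lucene.analysis.standard.StandardTokenizer;
    import org.apache.lucene.analysis.util.TokenizerFactory;
    import org.apache.lucene.util.AttributeFactory;

    public class MyTokenizerFactory extends TokenizerFactory {

      public MyTokenizerFactory(Map<String, String> args) {
        super(args);
        if (!args.isEmpty()) {
          throw new IllegalArgumentException("Unknown parameters: " + args);
        }
      }

      @Override
      public Tokenizer create(AttributeFactory factory) {
        // Delegate to StandardTokenizer for now; swap in a Tokenizer with your own rules here.
        return new StandardTokenizer(factory);
      }
    }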
Hello
I have 14 cores, with a couple of them using shards, and now I am looking at
the Master/Slave fallback solution. Can anyone please point me in the right
direction to get started?
Thanks
Kalpana
It's unclear what you're asking. You want your own
schema file? Or your own configuration for parsing
your documents?
Have you read through the reference guide
section here:
https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Solr+Cell+using+Apache+Tika
and if so, what parts are
I am looking to define a multi-valued field, for example the field "links", to
extract all links from the text field of each file.
I defined a regex for the link expression in tika.config.xml, but when
the indexing process finishes I get just one value, even though in
schema.xml I defined the field link
Hi, I would like to create a new field structure (tika-config.xml) for
indexing my files using Tika (ExtractingRequestHandler), and I just want a
working example to follow so that I can create my file. Thank you.
As per Shawn's advice I deleted the index data using
http://localhost:8983/solr/Sitecore_SharePoint/update?stream.body=%3Cdelete%3E%3Cquery%3E*:*%3C/query%3E%3C/delete%3E&commit=true
and then stopped and started Solr and the duplicates were gone.
Will keep a watch!
Thanks much!
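The same cleanup can be done from SolrJ if that is more convenient; a minimal sketch
(the core name is copied from the URL above, everything else is illustrative):

    import org.apache.solr.client.solrj.impl.HttpSolrClient;

    public class WipeMergedCore {
      public static void main(String[] args) throws Exception {
        // Equivalent of the stream.body delete-by-query URL above:
        // remove every document from the merged core, then commit.
        HttpSolrClient client = new HttpSolrClient("http://localhost:8983/solr/Sitecore_SharePoint");
        client.deleteByQuery("*:*");
        client.commit();
        client.close();
        // After that, re-run the /admin/cores?action=mergeindexes request once.
      }
    }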
Url
merge and still saw duplicates. I can try what you have recommended.
Thanks so much!
From: Shawn Heisey-2 [via Lucene]
[mailto:ml-node+s472066n4275813...@n3.nabble.com]
Sent: Tuesday, May 10, 2016 10:38 AM
To: Kalpana Sivanandan
Subject: Re: Solr 5.4.1 Mergeindexes duplicate rows
On 5/9
On 5/9/2016 7:55 AM, Kalpana wrote:
> Can anyone help me with a merge? Currently I have two cores already
> pulling data from a SQL table based on the query I set up.
>
> Solr is running.
>
> I also have a third core set up with a schema similar to the first two, and
> then I wrote this in the url a
Hello
Can anyone help me with a merge? Currently I have two cores already
pulling data from a SQL table based on the query I set up.
Solr is running.
I also have a third core set up with a schema similar to the first two, and
then I wrote this in the URL and hit enter:
http://localhost:8983/solr/
Querying on _uniqueKey:9105 returns only one doc from Core1 and 0 from Core2
before the merge
Yes, when I query them separately I do not see duplicates. I am using Solr
5.4.1. I created the core and then browsed to
http://localhost:8983/solr/admin/cores?action=mergeindexes&core=Sitecore_SharePoint&srcCore=sitecore_web_index&srcCore=SharePoint_All
Thanks
My _guess_ is that you somehow hit the merge multiple times and,
perhaps, interrupted it, and thus don't have complete duplicates.
If we're all talking about the same thing, what you're seeing doesn't
make sense. I'm assuming you're totally sure that a query on
_uniqueKey:9105
will return only one do
Thank you for your reply. I did see the website (the reason to use the merge
indexes). However, the individual cores do not have duplicates and the two cores
don't have common records, so I am not sure why there are duplicates.
One of them is a Sitecore core and the other one is a SQL db. They both have
d
On 5/6/2016 9:47 AM, Kalpana wrote:
> I am trying to create a new core by merging two indexes. All of them have
> the same schema, and the data on the cores does not have duplicates. As soon as I
> do a merge I see lots of duplicates. I used this for merging:
> http://localhost:8983/solr/admin/cores?acti
Hello
I am trying to create a new core by merging two indexes. All of them have
the same schema, and the data on the cores does not have duplicates. As soon as I
do a merge I see lots of duplicates. I used this for merging:
http://localhost:8983/solr/admin/cores?action=mergeindexes&core=Sitecore_SharePo
> >
> > make
> > {!func}sum(product(0.01,param1),
> > product(0.20,param2), min(param2,0.4)) desc
> >
> > This works great in Solr 4.10. However,
lrconfig.xml:
>
> parts
> score desc, Review1 asc, Rank2 asc
>
> make
> {!func}sum(product(0.01,param1),
> product(0.20,param2), min(param2,0.4)) desc
>
> This wor
This works great in Solr 4.10. However, in solr 5.4.1 and solr 5.5.0, I get
the below error. How do I write this kind of query with Solr 5?
Thanks,
Max.
ERROR org.apache.solr.handler.RequestHandlerBase [ x:productsearch] –
org.apache.solr.common.SolrException: Can't determ
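Not a definitive answer to that exception, but for what it's worth: Solr can sort on a
function written directly in the sort parameter, without the {!func} wrapper. A SolrJ
sketch under that assumption (the core name is taken from the error line above, the
params from the quoted config):

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;

    public class FunctionSort {
      public static void main(String[] args) throws Exception {
        HttpSolrClient client = new HttpSolrClient("http://localhost:8983/solr/productsearch");
        SolrQuery q = new SolrQuery("*:*");
        // Sort on the raw function expression instead of a {!func} query.
        q.addSort(SolrQuery.SortClause.desc(
            "sum(product(0.01,param1),product(0.20,param2),min(param2,0.4))"));
        System.out.println(client.query(q).getResults().getNumFound());
        client.close();
      }
    }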
Some documents have content that cannot be extracted and get stuck in the JVM of Solr;
I get this ERROR:
24/03/2016 at 19:26:59 ERROR null DocBuilder Exception while processing:
files document : null:org.apache.solr.handler.dataimport.DataImportHandlerException:
Unable to read content Processing Document # 1
I have migrated my app that used tomee plus 1.6.0.1, Solr (war) 4.7.2, and
Nutch 1.8 to Solr 5.4.1 (w/ jetty),
Nutch 1.11, and Solrj on openSUSE 13.1.
With Solr 5.4.1 I can happily:
- add static content
- add servlets (java, clojure)
- import crawl data via Nutch 1.11, to a single core solr
23 January 2016, Apache Solr™ 5.4.1 available
The Lucene PMC is pleased to announce the release of Apache Solr 5.4.1
Solr is the popular, blazing fast, open source NoSQL search platform
from the Apache Lucene project. Its major features include powerful
full-text search, hit highlighting