On 5/14/2020 3:14 PM, matthew sporleder wrote:
> Can a non-nested entity write into existing docs, or do they always
> have to produce document-per-entity?
This is the only thing I found on this topic, and it is on a third-party
website, so I can't say much about how accurate it is:
https://stac
It appears that adding entities to my entities in my data import
config is slowing down my import process by a lot. Is there a good
way to speed this up? I see the ID's are individually queried instead
of using IN() or similar normal techniques to make things faster.
Just looking for some tips.
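One general DIH technique for this (not from this thread; the table, column, and
field names below are hypothetical) is to cache the child entity so its rows are
fetched with a single query and joined in memory, instead of one lookup per
parent ID:

<document>
  <entity name="parent" query="SELECT id, title FROM parent_table">
    <field column="id" name="id"/>
    <field column="title" name="title"/>
    <!-- Child rows are read once with a single query and cached in memory,
         keyed by parent_id, instead of one SELECT per parent id. -->
    <entity name="child"
            processor="SqlEntityProcessor"
            cacheImpl="SortedMapBackedCache"
            cacheKey="parent_id"
            cacheLookup="parent.id"
            query="SELECT parent_id, tag FROM child_table">
      <field column="tag" name="tag"/>
    </entity>
  </entity>
</document>

The older CachedSqlEntityProcessor does the same thing; the trade-off either way
is that the whole child result set has to fit in memory during the import.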
-Original Message-
> > From: Mikhail Khludnev [mailto:m...@apache.org]
> > Sent: Monday, September 02, 2019 12:23 PM
> > To: Vadim Ivanov; solr-user
> > Subject: Re: Idle Timeout while DIH indexing and implicit sharding in 7.4
> >
> > It seems like reasonable behavior. Solr
ccept records from DIH. Am I wrong?
--
Vadim
Given that
org.apache.solr.common.util.FastInputStream.peek(FastInputStream.java:60)
at
org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:107)
JavabinLoader hangs on Stream.peek(), awaiting -1, and hits the timeout. I guess
it might be related to "closing so
I am facing the same exact issue. We never had any issue with 6.5.1 when doing a
full index (initial bulk load).
After upgrading to 7.5.0, I am getting the exception below, and indexing is
taking a very long time:
2019-09-01 10:11:27.436 ERROR (qtp1650813924-22) [c:c_collection s:shard1
r:core_node3 x:c_collection_
Sent: Friday, September 14, 2018 12:10 PM
To: solr-user
Subject: Re: Idle Timeout while DIH indexing and implicit sharding in 7.4
Hello, Vadim.
My guess (and only a guess) is that a bunch of updates coming into a shard causes
a heavy merge that blocks new updates in turn. This can be veri
with uneven shard distribution of documents.
Any suggestion how to mitigate issue?
--
BR
Vadim Ivanov
-Original Message-
From: Вадим Иванов [mailto:vadim.iva...@spb.ntk-intourist.ru]
Sent: Wednesday, September 12, 2018 4:29 PM
To: solr-user@lucene.apache.org
Subject: Idle Timeout whil
Hello gurus,
I am using solrCloud with DIH for indexing my data.
Testing 7.4.0 with an implicitly sharded collection, I have noticed that any
indexing run longer than 2 minutes always fails, with many timeout records in the
log coming from all replicas in the collection.
Such as:
x:Mycol_s_0_replica_t40 Reque
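For what it is worth, the two-minute threshold matches Jetty's default idle
timeout in recent Solr versions. In Solr 7.x it is set in
server/etc/jetty-http.xml roughly like this (exact file layout and default may
vary by version), and can be overridden with -Dsolr.jetty.http.idleTimeout=...
in SOLR_OPTS:

<!-- Approximate excerpt: the HTTP connector's idle timeout, 120000 ms by
     default, i.e. the two minutes after which idle update connections are
     dropped. -->
<Set name="idleTimeout">
  <Property name="solr.jetty.http.idleTimeout" default="120000"/>
</Set>

Raising it may only hide whatever is making the shards slow to respond, but it
is a quick way to confirm that the idle timeout is what is cutting off the DIH
update requests.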
Hi All,
I am using the MySQL data import handler to index documents from a database. I
have stored news articles in the database. I have made the following changes to
data-import.xml:
Added the following XML to the data-import.xml file
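(The poster's actual XML is not included in this excerpt. Purely as an
illustration, a minimal MySQL DIH configuration of this kind tends to look like
the following; the table, column, and credential names are made up.)

<dataConfig>
  <!-- Illustrative only: a JDBC data source pointing at the MySQL database. -->
  <dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost:3306/newsdb"
              user="dbuser" password="dbpass"/>
  <document>
    <!-- One Solr document per row of the (hypothetical) articles table. -->
    <entity name="article"
            query="SELECT id, title, body, published_at FROM articles">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <field column="body" name="body"/>
      <field column="published_at" name="published_at"/>
    </entity>
  </document>
</dataConfig>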
Just run another commit with openSearcher=true once your indexing process
finishes.
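(General background, not quoted from the thread: this pattern relies on hard
commits during the import not opening a searcher, which is controlled by the
autoCommit block in solrconfig.xml. With openSearcher=false, no new searcher and
therefore no autowarming happens until an explicit commit at the end.)

<!-- solrconfig.xml sketch: hard commits flush segments to disk but do not open
     a new searcher, so autowarming is skipped until a final commit with
     openSearcher=true is issued after the import. Values are examples only. -->
<autoCommit>
  <maxTime>60000</maxTime>
  <openSearcher>false</openSearcher>
</autoCommit>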
needs to be done _AFTER_ the DIH finishes (if anything)?
eg, does this need to be turned back on after the DIH has finished?
lt for Searcher@5b8d745 main
documentCache{lookups=0,hits=0,hitratio=0.00,inserts=0,evictions=0,size=0,warmupTime=0,cumulative_lookups=0,cumulative_hits=0,cumulative_hitratio=0.00,cumulative_inserts=0,cumulative_evictions=0}
thx
mark
Sorry about bringing an old thread back, I thought my solution could be
useful.
I also had to deal with multiple data sources. If the data source number
could be queried for in one of your parent entities then you could get it
using a variable as follows:
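A sketch of that idea (entity, column, and dataSource names below are
hypothetical, not taken from the original post):

<document>
  <!-- The parent row carries the name of the data source its children should
       be read from (source_name is a hypothetical column). -->
  <entity name="article" dataSource="meta"
          query="SELECT id, source_name FROM articles">
    <!-- The child entity resolves its dataSource attribute from the parent row
         at runtime, so one child definition serves every data source. -->
    <entity name="body" dataSource="${article.source_name}"
            query="SELECT text FROM bodies WHERE article_id='${article.id}'">
      <field column="text" name="text"/>
    </entity>
  </entity>
</document>

This assumes a matching <dataSource name="..."/> element is declared for each
value that source_name can take.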
cumbersome).
So, my data-config.xml looks like
Do I have to create entities for each data source, even though they contain
the same queries and operate on the same schema? I know the following is
not possible:
My error was in syntax. Sorry for the spam.
-Original Message-
From: Mukerjee, Neiloy (Neil) [mailto:neil.muker...@alcatel-lucent.com]
Sent: Wednesday, June 24, 2009 4:33 PM
To: solr-user@lucene.apache.org
Subject: DIH Indexing
I am running Solr 1.3 with Tomcat 6, and I am trying to import and index a
MySQL database. However, even though DIH seems to have installed properly, when
I try to do a full import via
http://localhost:8080/solr/dataimport?command=full-import, the following is
registered in my logs.
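(Generic sketch, not the poster's configuration: the /dataimport endpoint used
above is normally registered in solrconfig.xml along these lines, with the DIH
jars on the classpath and the referenced config file next to solrconfig.xml.)

<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <!-- Path of the DIH configuration file, relative to the conf directory. -->
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>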