Hello Tim,
Since 5.3 you can try adding the {!join ... score=none} local parameter; it
switches the algorithm to Lucene's JoinUtil, whose complexity depends on the
number of docs on the "from" side.
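For reference, a request using that parameter might look roughly like this (the field names and the child filter here are hypothetical, not from your schema):

```
q={!join from=parent_id to=id score=none}doc_type:child
```

With score=none the join is delegated to Lucene's JoinUtil rather than the default term-enumeration join, which can change performance characteristics depending on the size of the "from" side.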
On Sun, Aug 14, 2016 at 10:13 PM, Tim Frey wrote:
> Hi there. I'm trying to fix a performance problem I
Hi Mahmoud,
I haven't been following new DIH features, but I don't think there is
anything that provides such functionality; the only thing you can do is
track it in your source and index it (e.g. createDate and lastUpdatedDate
columns).
Regards,
Emir
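As a sketch, that approach could be wired into a DIH delta import roughly like this (the table and column names are assumptions; ${dataimporter.last_index_time} is the timestamp DIH keeps between runs):

```xml
<entity name="item" pk="ID"
        query="SELECT * FROM item"
        deltaQuery="SELECT ID FROM item
                    WHERE lastUpdatedDate &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT * FROM item
                          WHERE ID = '${dataimporter.delta.ID}'"/>
```

Running the handler with command=delta-import then re-fetches only rows whose lastUpdatedDate changed since the previous run.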
On 14.08.2016 20:56, Mahmoud Almok
Thank you Shalin. I've created SOLR-9414
KATHERINE MORA
Senior Engineer
-----Original Message-----
From: Shalin Shekhar Mangar [mailto:shalinman...@gmail.com]
Sent: Saturday, August 13, 2016 7:22 PM
To: solr-user@lucene.apache.org
Subject: Re: ConcurrentModificationException due to high volume o
I'm currently doing a POC with Solr 6 on my Windows 7 machine with 16GB RAM.
I successfully imported 16 million documents from SQL Server, where one of
the SQL columns is XML.
Whenever I query on the XML datatype (in Solr, it's a text field), I keep
getting SocketTimeoutException if the /select query goe
Hello,
Solr is trying to process non-existing child/nested entities. By
non-existing I mean that they exist in DB but should not be at Solr side
because they don't match the conditions in the query I use to fetch them.
I have the below solr data configuration. The relationship between tables
is c
Solr (well, DIH) just passes that query to the DB, so if you are
getting extra rows (not extra fields), then I would focus on the
database side of the situation.
Specifically, I would confirm from the database logs what the sent
query actually looks like.
Very specifically, in your very first enti
I think it is under solr_path/server/etc/, a file named jetty.xml.
2016-08-15 10:01 GMT-03:00 Stan Lee :
> I currently doing a POC with SOLR 6 on my windows 7, with 16GB ram.
> Successfully imported 16 million of documents from SQL Server, where one of
> the SQL column is an XML.
> Whenever I
Thanks for the prompt reply.
h.enabled=true is a typo. It should be c.enabled=true, because the table
companies also has a column called enabled. That part is working fine (it
doesn't fetch companies with enabled=false).
About the DB queries, I've taken, by turning Debug and Verbose on in the
Data
Hmm. I would still take as truth the database logs as opposed to Solr
logs. Or at least network traces using something like Wireshark.
Otherwise, you need some way to reduce your DIH query to the minimum
reproducible example. I am used to reading tech support emails and
even then I am not sure I c
I'm very sorry, but you're right. Using one of the queries from the query
log, I get 1 row(s) returned. So it isn't a Solr issue.
Thanks a lot Alexandre.
2016-08-15 16:17 GMT+02:00 Alexandre Rafalovitch :
> Hmm. I would still take as truth the database logs as opposed to Solr
> logs. Or at le
I have an AnalyticsQuery that takes several params computed at runtime
because they are dynamic. There is a MergeStrategy that needs to combine
stats data and merge doc Ids. There is a DocTransformer that injects some
stats into each returned doc. I cannot seem to get all the pieces to work
togeth
The formatting in my question was wiped, reposting
I have an AnalyticsQuery that takes several params computed at runtime
because they are dynamic. There is a MergeStrategy that needs to combine
stats data and merge doc Ids. There is a DocTransformer that injects some
stats into each returned
Hello Jennifer,
The spatial documentation is largely this page:
https://cwiki.apache.org/confluence/display/solr/Spatial+Search
(however note the online version is always for the latest Solr release. You
can download a PDF versioned against your Solr version).
To do polygon searches, you both nee
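As a concrete (hypothetical) example, once you have an RPT spatial field in your schema — and, depending on your Solr version, JTS on the classpath for WKT shapes — a polygon filter looks roughly like this (the field name geo and the coordinates are made up; WKT order is lon lat):

```
fq=geo:"Intersects(POLYGON((-91 32, -91 35, -87 35, -87 32, -91 32)))"
```

Here geo would be a field whose type is based on SpatialRecursivePrefixTreeFieldType (location_rpt in the default schemas).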
Hi,
a.) Yes, the index is static, not updated live. We index new documents over
old documents by this sequence: delete all docs, add 10 freshly fetched
from the db; after adding all the docs to the cloud instance, commit. Commit
happens only once per collection,
b.) I took one shard and below are the results