com/2015/03/04/solr-suggester/
Which version of Solr are you using?
Regards,
Edwin
On Tue, 6 Nov 2018 at 17:00, Clemens Wyss DEV wrote:
> At the moment we are using spellchecking-component for suggestions
> which is suboptimal, to say the least. What are best practices for
> suggestion
Hi Shalin,
> You can expect as many connection evictor threads
I have (for whatever reason (*)) 27 SolrClient instances instantiated, but I see ~95
"Connection Evictor" threads ...
>It turns out that I made a mistake in the patch I committed in...which names
>threads like pool-123-thread-1282.
>So if you
At the moment we are using spellchecking-component for suggestions which is
suboptimal, to say the least. What are best practices for suggestions using
Solr?
Googling (with excellent suggestions 😉) I came across
https://blog.trifork.com/2012/02/15/different-ways-to-make-auto-suggestions-with-sol
On 10/22/2018 6:15 AM, Shawn Heisey wrote:
> autoSoftCommit is pretty aggressive. If your commits are taking 1-2 seconds
> or les
well, some take minutes (re-index)!
> autoCommit is quite long. I'd probably go with 60 seconds
Which means every 1min the "pending"/"soft" commits are effectively s
On 10/21/2018 01:06 PM, Shawn Heisey wrote:
> You do it with the request, not with the client
For UpdateRequests, is it the "commitWithinMs" parameter? To me this
parameter sounds like telling the Solr server "I need to see this data within x
ms". As we have autoCommit and autoSoftCommit
...
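For illustration only, a minimal SolrJ sketch of setting commitWithin on an update (the solrClient variable, core name "core1" and the 5000 ms value are assumptions, not taken from these mails):

// Variant 1: per-call overload on SolrClient
SolrInputDocument doc = new SolrInputDocument();
doc.setField("id", "42");
solrClient.add("core1", doc, 5000); // ask Solr to make this doc visible within 5 s

// Variant 2: via an UpdateRequest
UpdateRequest req = new UpdateRequest();
req.add(doc);
req.setCommitWithin(5000);
req.process(solrClient, "core1");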
From: Shawn Heisey
Sent: Sunday, 21 October 2018 19:13
To: solr-user@lucene.apache.org
Subject: Re: 6.6 -> 7.5 SolrJ, seeing many "Connection evictor" threads
On 10/21/2018 10:13 AM, Clemens Wyss DEV wrote:
> Just upgrading from 6.6 to 7.5 and am now seeing many "Conne
Just upgrading from 6.6 to 7.5 and am now seeing many "Connection
evictor" threads which are all Thread.sleep()ing ...
As of 6.6 I am keeping the SolrClients (one per core) in a HashMap. Is this ok
or should I create a new SolrClient for each request I am doing?
SolrClient creation is as follows
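The actual creation code is cut off above; purely as a hedged sketch (base URL, timeouts and the helper name are assumptions), one-client-per-core caching could look like this. Reusing such clients is generally preferable to creating one per request, since each HttpSolrClient owns its own connection pool:

private final Map<String, SolrClient> clientsPerCore = new ConcurrentHashMap<>();

SolrClient clientFor(String coreName) {
    return clientsPerCore.computeIfAbsent(coreName, core ->
        new HttpSolrClient.Builder("http://localhost:8983/solr/" + core)
            .withConnectionTimeout(5000)   // connect timeout in ms (assumed value)
            .withSocketTimeout(30000)      // read/socket timeout in ms (assumed value)
            .build());
}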
then it's a feature of
spellchecking
-----Original Message-----
From: Mikhail Khludnev
Sent: Monday, 3 September 2018 13:17
To: solr-user
Subject: Re: Solr suggestions: why are exact matches omitted
I'm afraid only thorough debugging might answer.
On Mon, Sep 3, 2018 at 1
Sorry for not giving up on this issue:
is this "behavior" a feature or a bug?
-----Original Message-----
From: Clemens Wyss DEV
Sent: Thursday, 30 August 2018 18:01
To: 'solr-user@lucene.apache.org'
Subject: Solr suggestions: why are exact matches omitted
Or do the spellcheck results give an indication that "11000.35" has an exact
match?
-----Original Message-----
From: Clemens Wyss DEV
Sent: Thursday, 30 August 2018 18:01
To: 'solr-user@lucene.apache.org'
Subject: Solr suggestions: why are exact matche
Given the following configuration:
...
suggest_word_fuzzy
org.apache.solr.spelling.suggest.Suggester
org.apache.solr.spelling.suggest.fst.FuzzyLookupFactory
true
_my_suggest_word
2
What is the proposed way to get/build a SolrClient (connection) via
HttpClientUtil
- respecting a given connection and response (socket) timeout (PROP_SO_TIMEOUT,
PROP_CONNECTION_TIMEOUT)
- reusing the underlying http client (pooling?)
- what else makes sense (such as PROP_ALLOW_COMPRESSION, P
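As a possible, hedged answer sketch for the above, in the SolrJ 7.x-era API one pooled CloseableHttpClient can be built via HttpClientUtil and shared across per-core HttpSolrClient instances; all concrete values below are assumptions:

ModifiableSolrParams params = new ModifiableSolrParams();
params.set(HttpClientUtil.PROP_CONNECTION_TIMEOUT, 5000);  // connect timeout (ms)
params.set(HttpClientUtil.PROP_SO_TIMEOUT, 30000);         // socket/read timeout (ms)
params.set(HttpClientUtil.PROP_MAX_CONNECTIONS, 128);
params.set(HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, 32);
CloseableHttpClient httpClient = HttpClientUtil.createClient(params);

// Reuse the pooled client for every core-specific SolrClient:
SolrClient core1Client = new HttpSolrClient.Builder("http://localhost:8983/solr/core1")
    .withHttpClient(httpClient)
    .build();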
+1 ;)
-----Original Message-----
From: Susheel Kumar
Sent: Friday, 3 August 2018 14:40
To: solr-user@lucene.apache.org
Subject: Re: indexing two words, searching single word
and as you suggested, use stop word before shingles...
On Fri, Aug 3, 2018 at 8:10 AM, Clemens Wyss DEV
seems to "work"
-----Original Message-----
From: Clemens Wyss DEV
Sent: Friday, 3 August 2018 13:46
To: solr-user@lucene.apache.org
Subject: RE: indexing two words, searching single word
>Because you probably are not looking for "andthe" kind o
That is your generic problem then. Because you probably are not looking for
"andthe" kind of tokens.
However, a shingle plus a regex to remove whitespace can give you "anytwo
wordstogether smooshed" tokens in the index.
Regards,
Alex
On Fri, Aug 3, 2018, 7:19 AM Clemens
Hi Markus,
thanks for the quick answer.
"sound stage" was just an example. We are looking for a generic solution ...
Is it "ok" to apply an NGRamFilter for query-analyzing?
I guess (besides the performance impact) this reduces search results accuracy?
-Clemens
---
Sounds like a rather simple issue:
if I index "sound stage" and search for "soundstage" I get no hits
What am I doing wrong
a) when indexing
b) when searching
?
Thx in advance
- Clemens
> I almost guarantee that buildOnCommit will be unsatisfactory
if not "on commit" when should suggestions/spellcheckings be updated? And how?
Spellchecking/suggestions@solr:
what are the best (up-to-date) sources/links for spellchecking and suggestions?
-----Original Message-----
From: E
Suppose you could structure your test this way:
index 100 docs, 3 of them have a specific term.
Set your threshold to 2%
Check that the term is suggested
index 100 more docs
Check that the term is _not_ suggested.
Best,
Erick
On Sun, Jan 28, 2018 at 7:24 AM, Clemens Wyss DEV wrote:
> I must
l give us a clue what's going on.
3> Show us the code if you can.
Best,
Erick
On Sat, Jan 27, 2018 at 6:55 AM, Clemens Wyss DEV wrote:
> Erick said/wrote:
>> If you commit after docs are deleted and _still_ see them in search
>> results, that's a JIRA
> should I
org
Subject: Re: RE: RE: SolrClient#updateByQuery?
On 1/27/2018 12:49 AM, Clemens Wyss DEV wrote:
> Thanks for all these (main contributor's 😉) valuable inputs!
>
> First thing I did was getting rid of "expungeDeletes". My
> "single-deletion"
make sense or should I JIRA it?
How expensive is this "optimization"?
BTW: we are on Solr 6.6.0
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Saturday, 27 January 2018 08:50
To: 'solr-user@lucene.apache.org'
Subject: RE: RE: So
Does this make sense or should I JIRA it?
How expensive is this "optimization"?
-----Original Message-----
From: Shawn Heisey [mailto:apa...@elyograg.org]
Sent: Saturday, 27 January 2018 00:49
To: solr-user@lucene.apache.org
Subject: Re: RE: SolrClient#updateByQuery?
On 1/26/2018
dates.
HTH,
Emir
--
Monitoring - Log Management - Alerting - Anomaly Detection Solr & Elasticsearch
Consulting Support Training - http://sematext.com/
> On 26 Jan 2018, at 17:10, Clemens Wyss DEV wrote:
>
> SolrClient has the method(s) deleteByQuery (which I make use of when I need
> to
SolrClient has the method(s) deleteByQuery (which I make use of when I need to
reindex).
#updateByQuery does not exist. What if I want to "update all documents
matching a query"?
Thx
Clemens
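SolrJ indeed has no updateByQuery; a frequently used workaround is to query for the ids of the matching documents and send atomic updates. A hedged sketch (the query, field names and core name are made up for illustration):

SolrQuery q = new SolrQuery("category_s:foo");
q.setFields("id");
q.setRows(1000); // for large result sets, page with cursorMark instead
for (SolrDocument hit : solrClient.query("core1", q).getResults()) {
    SolrInputDocument update = new SolrInputDocument();
    update.setField("id", hit.getFieldValue("id"));
    // atomic "set" of a single field, leaving the rest of the document untouched
    update.setField("status_s", Collections.singletonMap("set", "archived"));
    solrClient.add("core1", update);
}
solrClient.commit("core1");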
.
https://lucene.apache.org/solr/guide/7_2/field-properties-by-use-case.html
wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/ (my blog)
> On Jan 17, 2018, at 11:23 PM, Clemens Wyss DEV wrote:
>
> Kind of "basic question" ... Am I right, that the only r
Kind of "basic question" ... Am I right, that the only real reason to store a
field (stored="true") is when I want to fetch the "originating value" from
documents returned?
What about
geo-location-fields?
Any other reason/(search-)function requiring a field being stored?
Thx
Clemens
Sorry for "re-asking". Anybody else facing this issue (bug?), or can anybody
provide an advice "where to look"?
Thx
Clemens
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Wednesday, 1 November 2017 11:06
To: 'solr-user@lu
Context: Solr 6.6.0
I'm switching my schemas from the deprecated solr.LatLonType to
solr.LatLonPointSpatialField. Now my sort query (which used to work with
solr.LatLonType):
sort=geodist(b4_location__geo_si,47.36667,8.55) asc
raises the error
"sort param could not be parsed as a query, and is not
I am (seldom) seeing NPEs at line 610 of HttpSolrClient:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error
from server at http://xxx.xxx.x.xxx:8983/solr/core1:
java.lang.NullPointerException
at
org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(Http
I am seeing many exceptions like this in my Solr [5.4.1] log:
null:java.lang.StringIndexOutOfBoundsException: String index out of range: -2
at
java.lang.AbstractStringBuilder.replace(AbstractStringBuilder.java:824)
at java.lang.StringBuilder.replace(StringBuilder.java:262)
does
http://localhost:8983/solr/admin/cores?action=RELOAD
reload all cores?
Thx
Clemens
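If a parameterless RELOAD turns out not to cover every core, a hypothetical SolrJ fallback is to reload each core individually via the CoreAdmin API (base URL is an assumption):

try (SolrClient admin = new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
    NamedList<NamedList<Object>> status =
        CoreAdminRequest.getStatus(null, admin).getCoreStatus(); // status of all cores
    for (int i = 0; i < status.size(); i++) {
        CoreAdminRequest.reloadCore(status.getName(i), admin);
    }
}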
I am still using 5.4.1 and have the following code to create a new core:
...
Properties coreProperties = new Properties();
coreProperties.setProperty( CoreDescriptor.CORE_CONFIGSET, configsetToUse );
CoreDescriptor coreDescriptor = new CoreDescriptor( container, coreName,
coreFolder, corePropertie
: Re: RE: RE: OutOfMemory when batch updating from SolrJ
On 2/22/2016 1:55 AM, Clemens Wyss DEV wrote:
> SolrClient solrClient = getSolrClient( coreName, true );
> Collection batch = new ArrayList();
> while ( elements.hasNext() ) {
> IIndexableElement elem = elements.next();
> solrClient.add( documents ); // [2]
is of course:
solrClient.add( batch ); // [2]
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Monday, 22 February 2016 09:55
To: solr-user@lucene.apache.org
Subject: RE: RE: OutOfMemory when batch updating f
hen batch updating from SolrJ
On 2/19/2016 3:08 AM, Clemens Wyss DEV wrote:
> The logic is somewhat this:
>
> SolrClient solrClient = new HttpSolrClient( coreUrl ); while ( got
> more elements to index ) {
> batch = create 100 SolrInputDocuments
> solrClient.add( batch )
> }
ceProblems
Thanks,
Susheel
On Fri, Feb 19, 2016 at 9:17 AM, Clemens Wyss DEV
wrote:
> > increase heap size
> this is a "workaround"
>
> Doesn't SolrClient free part of its buffer? At least documents it has
> sent to the Solr-Server?
>
> -
rk/is the issue.
java -Xmx4096m
Thanks,
Susheel
On Fri, Feb 19, 2016 at 6:25 AM, Clemens Wyss DEV
wrote:
> Still guessing ;) :
> must I commit after every "batch" in order to force a flush of
> org.apache.solr.client.solrj.request.RequestWriter$LazyContentStream et al?
>
ing-transaction-logs-softcommit-and-commit-in-sorlcloud/
'Be very careful committing from the client! In fact, don’t do it'
I would not want to commit "just to flush a client side buffer" ...
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent
ore 'fust-1-fr_CH_1' -3-thread-1 Thread
And there is another byte[] with 260MB.
The logic is somewhat this:
SolrClient solrClient = new HttpSolrClient( coreUrl );
while ( got more elements to index )
{
batch = create 100 SolrInputDocuments
solrClient.add( batch )
}
-----Original Message-----
Environment: Solr 5.4.1
I am facing OOMs when batch updating via SolrJ. I am seeing approx 30'000(!)
SolrInputDocument instances, although my batch size is 100. I.e. I call
solrClient.add( documents ) only every 100 documents. So I'd expect to see
at most 100 SolrInputDocuments in memory at any
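For reference, one hedged way to write the batching loop described above so that already-sent documents can be garbage collected (toSolrInputDocument and the batch size are placeholders, not code from the mails):

List<SolrInputDocument> batch = new ArrayList<>(100);
while (elements.hasNext()) {
    batch.add(toSolrInputDocument(elements.next())); // placeholder mapping method
    if (batch.size() == 100) {
        solrClient.add(batch);
        batch.clear(); // drop references so the sent documents can be GC'd
    }
}
if (!batch.isEmpty()) {
    solrClient.add(batch);
}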
too much time on a
significantly-sized corpus to be feasible. At least that's my fear, I'm mostly
advising you to check this before even trying to scale up.
Best,
Erick
On Wed, Feb 3, 2016 at 11:07 PM, Clemens Wyss DEV wrote:
> Sorry for coming back to this topic:
> You (Erick
mailto:erickerick...@gmail.com]
Sent: Monday, 4 January 2016 17:36
To: solr-user
Subject: Re: Hard commits, soft commits and transaction logs
As far as I know. If you see anything different, let me know and we'll see if
we can update it.
Best,
Erick
On Mon, Jan 4, 2016 at 1:34 AM, Cl
[Happy New Year to all]
Is all herein
https://lucidworks.com/blog/2013/08/23/understanding-transaction-logs-softcommit-and-commit-in-sorlcloud/
mentioned/recommended still valid for Solr 5.x?
- Clemens
er
Subject: Re: Is it possible to sort on a BooleanField?
Please share your schema.
On Thu, Dec 3, 2015 at 11:28 AM, Clemens Wyss DEV
wrote:
> Looks like not. I get to see
> 'can not sort on a field which is neither indexed nor has doc values:
> '
>
> - Clemens
>
Looks like not. I get to see
'can not sort on a field which is neither indexed nor has doc values: '
- Clemens
t. Don't worry too much, the title etc. can be
changed later as things become clearer.
Best,
Erick
On Wed, Jun 3, 2015 at 5:58 AM, Clemens Wyss DEV wrote:
> Hi Mark,
> what exactly should I file? What needs to be added/appended to the issue?
>
> Regards
> Clemens
File a JIRA issue please. That OOM Exception is getting wrapped in a
> RuntimeException it looks. Bug.
>
> - Mark
>
>
> On Wed, Jun 3, 2015 at 2:20 AM Clemens Wyss DEV
> wrote:
>
>> Context: Lucene 5.1, Java 8 on debian. 24G of RAM whereof 16G
>> available for Sol
ograg.org]
Sent: Wednesday, 3 June 2015 09:16
To: solr-user@lucene.apache.org
Subject: Re: Solr OutOfMemory but no heap and dump and oo_solr.sh is not
triggered
On 6/3/2015 12:20 AM, Clemens Wyss DEV wrote:
> Context: Lucene 5.1, Java 8 on debian. 24G of RAM whereof 16G available for
>
Context: Lucene 5.1, Java 8 on Debian. 24G of RAM, of which 16G are available for
Solr.
I am seeing the following OOMs:
ERROR - 2015-06-03 05:17:13.317; [ customer-1-de_CH_1]
org.apache.solr.common.SolrException; null:java.lang.RuntimeException:
java.lang.OutOfMemoryError: Java heap space
a
take many minutes so I recommend against these options for a large
index, and strongly recommend you test these with a large corpus.
Best,
Erick
On Mon, Jun 1, 2015 at 4:01 AM, Clemens Wyss DEV wrote:
> Lucene 5.1:
> I am (also) facing
> "java.lang.IllegalStateException: sug
Lucene 5.1:
I am (also) facing
"java.lang.IllegalStateException: suggester was not built"
At the moment no new documents seem to be added to the index/core. Will a
reboot "sanitize" the index/core?
I (still) have
name="buildOnCommit">true
How can I tell Solr to periodically update the s
26, 2015 at 9:15 AM, Clemens Wyss DEV wrote:
> I also noticed that (see my post this "morning") ...
> SOLR_OPTS="$SOLR_OPTS -Dsolr.allow.unsafe.resourceloading=true"
> ...
> Is not taken into consideration (anymore). Same "bug"?
>
>
> -
I also noticed that (see my post this "morning")
...
SOLR_OPTS="$SOLR_OPTS -Dsolr.allow.unsafe.resourceloading=true"
...
is not taken into consideration (anymore). Same "bug"?
-----Original Message-----
From: Ere Maijala [mailto:ere.maij...@helsinki.fi]
Sent: Wednesday, 15 April 2015 0
tp://www.lucidworks.com
> On May 13, 2015, at 3:49 AM, Clemens Wyss DEV wrote:
>
> I'd like to make use of solr.allow.unsafe.resourceloading=true.
> Is the commandline "-D solr.allow.unsafe.resourceloading=true" the only way
> to inject
ore/schema?wt=schema.xml&indent=on”:
——
_my_id
——
Steve
> On May 15, 2015, at 8:57 AM, Clemens Wyss DEV wrote:
>
> Thought about that too (should have written ;) ).
> When I remove the schema-tag from the composite xml I get:
> org.apache.solr.common.S
schema file
<- the included schema-common.xml
tags from your schema-common.xml. You won’t be able to use
it alone in that case, but if you need to do that, you could just create
another schema file that includes it inside wrapping tags.
Steve
> On May 15, 2015, at 4:01
Given the following schema.xml
_my_id
When I try to include the very schema from another schema file, e.g.:
http://www.w3.org/2001/XInclude"/>
I get SolrException
copyField source :'_my_title' is not a glob and doesn't match any explicit
f
I'd like to make use of solr.allow.unsafe.resourceloading=true.
Is the commandline "-D solr.allow.unsafe.resourceloading=true" the only way to
inject/set this property or can it be done (e.g.) in solr.xml ?
Thx
Clemens
/2012/02/14/indexing-with-solrj/
Best,
Erick
On Fri, May 8, 2015 at 7:30 AM, Clemens Wyss DEV wrote:
> On one of my fields (the "phrase suggestion" field) has 30'860'099 terms. Is
> this "too much"?
> Another field (the "single word suggestion") h
One of my fields (the "phrase suggestion" field) has 30'860'099 terms. Is
this "too much"?
Another field (the "single word suggestion") has 2'156'218 terms.
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.
Context: Solr/Lucene 5.1
Is there a way to determine which documents occupy a lot of "space" in the index? As
I don't store any fields that contain text, it must be the terms extracted from
the documents that occupy the space.
So my question is: which documents occupy the most space in the inverted index?
79
If you don't use the suggest component, the easiest fix is to comment it out.
-Yonik
On Sun, May 3, 2015 at 1:11 PM, Clemens Wyss DEV wrote:
> I guess it's the "searcherExecutor-7-thread-1 (30)" which seems to be loadi
rCore.java:1751)
java.util.concurrent.FutureTask.run(Unknown Source)
java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
java.lang.Thread.run(Unknown Source)
-----Original Message-----
From: Clemens Wyss DEV [mailto:cleme
about
what the system is doing.
You can get the same information from the command line using
# jstack (pid) > output.log
Best,
Andrea
On 3 May 2015 18:53, "Clemens Wyss DEV" <clemens...@mysign.ch> wrote:
> Just opened the very core in a "normal"
Just opened the very core in a "normal" Solr server instance. Same delay until
it's usable. I.e. nothing to do with embedded mode or any other thread slowing
things down.
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Sunday, 3 Ma
ed with Tika.
-----Original Message-----
From: Yonik Seeley [mailto:ysee...@gmail.com]
Sent: Sunday, 3 May 2015 17:53
To: solr-user@lucene.apache.org
Subject: Re: "blocked" in org.apache.solr.core.SolrCore.getSearcher(...) ?
What are the other threads doing during this time?
-Yo
> more than 15 minutes
It took 37 minutes!
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Sunday, 3 May 2015 10:00
To: solr-user@lucene.apache.org
Subject: "blocked" in org.apache.solr.core.SolrCore.getSearcher(...) ?
Cont
Context: Solr 5.1, EmbeddedSolrServer(-mode)
I have a rather big index/core (>1G). I was able to initially index this core
and could then search within it. Now when I restart my app I am no longer able to
search.
getSearcher seems to "hang"... :
java.lang.Object.wait(long) line: not available [n
If I run Solr in embedded mode (which I shouldn't, I know ;) ) how do I know
(event?) that the cores are up and running, i.e. everything is initialized?
Thx
Clemens
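With EmbeddedSolrServer you hold the CoreContainer yourself, so one possible (purely illustrative) approach is to poll it until the expected core reports as loaded; the helper, timeout and polling interval are assumptions, not an official readiness API:

static void awaitCore(CoreContainer container, String coreName, long timeoutMs)
        throws InterruptedException {
    long deadline = System.currentTimeMillis() + timeoutMs;
    while (!container.isLoaded(coreName)) {
        if (System.currentTimeMillis() > deadline) {
            throw new IllegalStateException("core not loaded in time: " + coreName);
        }
        Thread.sleep(100); // poll until the core shows up as loaded
    }
}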
Thx. It would be helpful if we could see the originating request URL for this error.
- Clemens
PS: Last time I saw "Hoss" was when watching Bonanza as a kid ;)
-----Original Message-----
From: Chris Hostetter [mailto:hossman_luc...@fucit.org]
Sent: Friday, 24 April 2015 19:15
To: solr-user
org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Unknown Source)
What are possible reasons for this?
Thx
Clemens
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Friday, 24 April 2015 14:01
To: solr-user@lucene.apache.org
Context: Solr/Lucene 5.1
Adding documents to Solr core/index through SolrJ
I extract PDFs using Tika. The PDF content is one of the fields of my
SolrDocuments that are transmitted to Solr using SolrJ.
As not all documents seem to be "coming through", I looked into the Solr logs
and see the follw
I am seeing the following stacktrace(s):
Caused by: java.lang.IllegalArgumentException: Unknown type of result: class
javax.xml.transform.dom.DOMResult
at
net.sf.saxon.event.SerializerFactory.getReceiver(SerializerFactory.java:154)
~[netcdfAll.jar:4.5.4]
at
net.sf.saxon.Identity
facet15__d_i:[2.0 TO *] don't you?
On Tue, Mar 3, 2015 at 12:00 PM, Clemens Wyss DEV
wrote:
> [Solr 5.0]
> Whereas in
>
> fq={!tag="facet15"}facet15__d_i:1.8 facet15__d_i:2.2
> &q=(*:*)
> &facet=true
> &facet.mincount=1
> &facet.field={!key="fa
[Solr 5.0]
Whereas in
fq={!tag="facet15"}facet15__d_i:1.8 facet15__d_i:2.2
&q=(*:*)
&facet=true
&facet.mincount=1
&facet.field={!key="facet15" ex="facet15"}facet15__d_i
"facet15" is not affected by the fq (as desired). This does not hold true for
the facet.query
fq={!tag="till2"}facet15__d_i:[
lder version of noggit around. You need
version 0.6.
Alan Woodward
www.flax.co.uk
On 23 Feb 2015, at 13:00, Clemens Wyss DEV wrote:
> Just about to upgrade to Solr5. My UnitTests fail:
> 13:50:41.178 [main] ERROR org.apache.solr.core.CoreContainer - Error
> creating
Just about to upgrade to Solr5. My UnitTests fail:
13:50:41.178 [main] ERROR org.apache.solr.core.CoreContainer - Error creating
core [1-de_CH]: null
java.lang.ExceptionInInitializerError: null
at
org.apache.solr.core.SolrConfig.getConfigOverlay(SolrConfig.java:359)
~[solr-core.jar:5.0.0
Does Solr provide a (Java) constant for "the name of the version field" (i.e.
_version_)?
"coreContainer.getCoresLocator().create(coreContainer,
dcore);"
When doing the two calls:
a) core.properties is created
AND
b) the cores are loaded upon container startup ;)
:-)
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: F
Subject: Re: RE: RE: CoreContainer#createAndLoad, existing cores not loaded
On 1/29/2015 10:15 AM, Clemens Wyss DEV wrote:
>> to put your solr home inside the extracted WAR
> We are NOT using war's
>
>> coreRootDirectory
> I don't have this property in my sorl.xml
>
>
going the "bleeding edge"-way!
-----Original Message-----
From: Shawn Heisey [mailto:apa...@elyograg.org]
Sent: Thursday, 29 January 2015 18:10
To: solr-user@lucene.apache.org
Subject: Re: RE: CoreContainer#createAndLoad, existing cores not loaded
On 1/29/2015 12:13 AM, Clemen
uar 2015 18:13
To: solr-user@lucene.apache.org
Subject: Re: RE: Building Solr 5 from svn sources
On 1/29/2015 10:03 AM, Clemens Wyss DEV wrote:
> Why are solr*jars not being built? All others (including lucene) are built.
What steps are you taking, and what is not there that you expect to be there?
Thanks,
Shawn
oad, existing cores not loaded
On 1/29/2015 12:08 AM, Clemens Wyss DEV wrote:
> Thx Shawn. I am running latest-greatest Solr (4.10.3) Solr home is
> e.g.
> /opt/webs//WebContent/WEB-INF/solr
> the core(s) reside in
> /opt/webs//WebContent/WEB-INF/solr/cores
> Should these be found
Why are the solr* jars not being built? All others (including Lucene) are built.
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Thursday, 29 January 2015 13:47
To: solr-user@lucene.apache.org
Subject: RE: Building Solr 5 from svn sources
Thx
lucene_solr_5_0
http://mail-archives.apache.org/mod_mbox/lucene-dev/201501.mbox/%3CCAKiERN4-qbj7BF%3DJgui4xUFKujwuP%2BodkZPesVT51xnXG1om_w%40mail.gmail.com%3E
2015-01-29 19:29 GMT+09:00 Clemens Wyss DEV :
> Looks like trunk is Solr 6?
> Should I build Solr 5 from
> http://svn.apache.org/repos/a
Looks like trunk is Solr 6?
Should I build Solr 5 from
http://svn.apache.org/repos/asf/lucene/dev/branches/branch_5x
https://issues.apache.org/jira/browse/SOLR-6718
looks like I am not alone with my "weird" questions/ideas ;)
And I should really switch over to 5 ;)
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Thursday, 29 January 2015 08:08
To:
BTW:
None of my core folders contains a core.properties file ... ? Could it be due
to the fact that I am (so far) running only EmbeddedSolrServer, hence no real
Solr-Server?
-----Original Message-----
From: Clemens Wyss DEV [mailto:clemens...@mysign.ch]
Sent: Thursday, 29 January
-user@lucene.apache.org
Subject: Re: CoreContainer#createAndLoad, existing cores not loaded
On 1/28/2015 8:52 AM, Clemens Wyss DEV wrote:
> My problem:
> I create cores dynamically using container#create( CoreDescriptor ) and then
> add documents to the very core(s). So far so go
My problem:
I create cores dynamically using container#create( CoreDescriptor ) and then
add documents to the very core(s). So far so good.
When I restart my app I do
container = CoreContainer#createAndLoad(...)
but when I then call container.getAllCoreNames() an empty list is returned.
What core
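Tying this to the coresLocator call quoted earlier in this listing, a hedged sketch of creating a core and also persisting it so that core discovery finds it again after a restart (the calls mirror the 5.x-era fragments quoted above; exact signatures vary between Solr versions and are not guaranteed against any specific release):

Properties coreProperties = new Properties();
coreProperties.setProperty(CoreDescriptor.CORE_CONFIGSET, configsetToUse);
CoreDescriptor dcore = new CoreDescriptor(container, coreName, coreFolder, coreProperties);
SolrCore core = container.create(dcore);                // create/load the core
container.getCoresLocator().create(container, dcore);   // writes core.properties for discovery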
https://issues.apache.org/jira/browse/LUCENE-5820
Due to the missing factory the SuggestStopFilter is not "usable" before
Solr/Lucene 5, right?
Any plan on when Solr 5 will appear?
How can I get hold of Solr/Lucene 5?
ar 2015 15:24
To: solr-user@lucene.apache.org
Subject: Re: RE: RE: transactions@Solr(J)
On 1/20/2015 11:42 PM, Clemens Wyss DEV wrote:
> But then what happens if:
> Autocommit is set to 10 docs
> and
> I add 11 docs and then decide (due to an exception?) to rollback.
>
> Will o
AM, Clemens Wyss DEV wrote:
> Thanks Mike,
>> but a key difference is that when one client commits, all clients
>> will see the updates
> That's ok.
>
> What about the autoCommit setting(s) in solrconfig.xml? Doesn't this mean
> that after adding x elements (or after
.org
Subject: Re: transactions@Solr(J)
On 1/20/2015 5:18 AM, Clemens Wyss DEV wrote:
> http://stackoverflow.com/questions/10805117/solr-transaction-managemen
> t-using-solrj Is it true, that a SolrServer-instance denotes a
> "transaction context"?
>
> Say I have two concurrent t
solrJ
workaround.
Ahmet
On Tuesday, January 20, 2015 2:22 PM, Clemens Wyss DEV
wrote:
Thx, but sorry for asking:
what is the corresponding SolrJ command?
SolrServer#commit()
SolrServer#commit( boolean waitFlush, boolean waitSearcher )
SolrServer#commit( boolean waitFlush, boolean
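A hedged SolrJ counterpart of an update request carrying commit=true&expungeDeletes=true (core name assumed) could be an otherwise empty UpdateRequest with those parameters set:

UpdateRequest req = new UpdateRequest();
req.setParam("commit", "true");
req.setParam("expungeDeletes", "true");
req.process(solrClient, "core1");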
[mailto:iori...@yahoo.com.INVALID]
Sent: Tuesday, 20 January 2015 13:14
To: solr-user@lucene.apache.org
Subject: Re: RE: TermsComponent, buildOnCommit?
Hi,
curl http://localhost:8983/solr/core/update?commit=true&expungeDeletes=true
ahmet
On Tuesday, January 20, 2015 1:51 PM, Clemens Wyss DEV
w
Commit?
Hi,
Deleted terms could confuse you. commit with expunge deletes or optimise will
purge deleted terms.
Ahmet
On Tuesday, January 20, 2015 1:03 PM, Clemens Wyss DEV
wrote:
Does the TermsComponent (/terms) have something like buildOnCommit ? Or is it
always up-to-date (<- my unit t
Does the TermsComponent (/terms) have something like buildOnCommit? Or is it
always up-to-date (<- my unit tests suggest otherwise)?
http://stackoverflow.com/questions/10805117/solr-transaction-management-using-solrj
Is it true that a SolrServer instance denotes a "transaction context"?
Say I have two concurrent threads, each having a SolrServer instance "pointing"
to the same core. Then each thread can add/update/delete do