Hi,
can someone give me a hint about the meaning of the different colors
inside the cloud admin view graphics? I see orange, yellow, and gray colors
but can't find anything in the documentation about their meaning
(especially the gray color :) ).
I have an external ZooKeeper server with 4 nodes (2 sh
I think this means the pattern did not match any files:
0
The wiki example includes a '^' at the beginning of the filename pattern.
That anchors the regex so the whole name must match, not just a substring.
http://wiki.apache.org/solr/DataImportHandler#Transformers_Example
More:
Add rootEntity="true". It cannot hurt to be explicit.
The d
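Putting both suggestions together, a sketch of what the entity definition could look like (entity name and baseDir are illustrative, not from the thread):

```xml
<!-- fileName is a regex; the leading ^ (together with $) anchors it so
     the complete file name must match, not just a substring -->
<entity name="files" processor="FileListEntityProcessor"
        baseDir="/path/to/data" fileName="^.*\.xml$"
        rootEntity="true">
</entity>
```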
Manish,
Need to set hasSingleNormFile="0" in the schema.
On Sun, Nov 18, 2012 at 9:11 AM, Manish Bafna wrote:
> Hi,
> I need to disable HasSingleNormFile in solr, so that multiple norm files
> are created. Can anyone please provide information on how to disable this in
> solr.
>
>
> If HasSingleNormFile
Hi,
These are the exact steps that I have taken to try to get the delta import
handler working. If I can provide any more information to help, let me know.
I have literally spent all of Friday night and today on this, and I'm throwing
in the towel. Where have I gone wrong?
*Added this line to the solrc
On 11/16/2012 12:30 PM, Shawn Heisey wrote:
I am extremely interested in the Unicode behavior of ICUTokenizer, but
I cannot disable the punctuation-splitting behavior and let WDF handle
it properly, which causes recall problems. There is no filter that I
can run after tokenization, either. Lo
bq. fetch the configuration and store it locally.
New nodes don't fetch the configs and store them locally - configs are
loaded straight from zookeeper currently.
- Mark
Kobayashi-san,
I am sorry, but I don't understand your question. There is a book that is an
essential guide for Japanese-speaking new Solr users:
http://wiki.apache.org/solr/GijutsuHyohronBook2010
--- On Fri, 11/16/12, alu wrote:
> From: alu
> Subject: Re: Neary text search system with sol
On 11/16/2012 12:52 PM, Shawn Heisey wrote:
On 11/16/2012 12:36 PM, Jack Krupansky wrote:
Generally, you don't need the preserveOriginal attribute for WDF.
Generate both the word parts and the concatenated terms, and queries
should work fine without the original. The separated terms will be
in
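As a sketch, a WDF configuration along those lines (analyzer context omitted, attribute values illustrative):

```xml
<!-- generate word parts and catenated terms; skip the original token -->
<filter class="solr.WordDelimiterFilterFactory"
        generateWordParts="1" generateNumberParts="1"
        catenateWords="1" catenateNumbers="1"
        preserveOriginal="0"/>
```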
You can force Solr to use the new configs by reloading a collection:
http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection
This'll cause all shards (and replicas) in a collection to pick up the new
configs from ZooKeeper.
The main thing to note re Jetty is that the Jetty incl
Hi Spadez,
Nabble has helpfully stripped out your script. Maybe don't use Nabble?
Steve
On Nov 16, 2012, at 5:06 PM, Spadez wrote:
> Hey guys,
>
> I am after a bash script (or python script) which I can use to trigger a
> delta import of XML files via CRON. After a bit of digging and modific
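For what it's worth, a minimal sketch of such a trigger in Python, assuming the DIH endpoint lives at /dataimport on a core named collection1 on the default host/port (all of those are assumptions, adjust for your setup):

```python
#!/usr/bin/env python3
# Sketch of a cron-driven DIH delta-import trigger. The core name
# "collection1" and the host/port are assumptions -- adjust as needed.
from urllib.parse import urlencode
from urllib.request import urlopen

def build_delta_import_url(base="http://localhost:8983/solr/collection1"):
    """Build the DataImportHandler delta-import command URL."""
    params = urlencode([("command", "delta-import"),
                        ("clean", "false"),
                        ("commit", "true")])
    return "%s/dataimport?%s" % (base, params)

if __name__ == "__main__":
    url = build_delta_import_url()
    print(url)
    # urlopen(url)  # uncomment to actually fire the import from cron
```

Dropped into crontab (e.g. `0 * * * * /path/to/trigger.py`), this would fire a delta-import every hour.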
I would _guess_ (but haven't done this with DIH) that simply putting
the body.chain in the updatehandler ()
would do what you want.
But that's purely a guess at this point on my part.
Anyone want to correct me?
Best
Erick
On Fri, Nov 16, 2012 at 4:50 PM, srinalluri wrote:
> I have a new upd
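To make that guess concrete, here is roughly what it might look like in solrconfig.xml. The chain name "mychain" is assumed, and whether DIH honors update.chain this way is exactly the open question in the thread:

```xml
<requestHandler name="/dataimport"
                class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
    <!-- assumed: point the handler at the update processor chain -->
    <str name="update.chain">mychain</str>
  </lst>
</requestHandler>
```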
1> Well, it uploads the local conf directory to ZooKeeper so new nodes can
fetch the configuration and store it locally.
2> No, you have to upload the configuration to ZK and (I think) restart the
other servers. It's easy enough to test, just make your changes to the
config, upload it, and look at
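As a sketch of that test cycle (the script path, zkhost, and conf/collection names are assumptions for a default 4.x layout):

```shell
# upload the edited conf directory to ZooKeeper ...
cloud-scripts/zkcli.sh -zkhost localhost:2181 -cmd upconfig \
    -confdir ./solr/collection1/conf -confname myconf
# ... then reload the collection so it picks up the new configs
curl "http://localhost:8983/solr/admin/collections?action=RELOAD&name=mycollection"
```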
There was a discussion of this a bit ago, but the upshot is that the
maintainer hasn't released a version compatible with 4.0 yet. Send him
money ...
FWIW,
Erick
On Fri, Nov 16, 2012 at 11:16 AM, Miguel Ángel Martín <
miguelangel.mar...@brainsins.com> wrote:
> hi all:
>
> i can open an index cr
Hmmm, first an aside. If by "commit after every batch of documents" you
mean after every call to server.add(doclist), there's no real need to do
that unless you're striving for really low latency. The usual
recommendation is to use commitWithin when adding and commit only at the
very end of the ru
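For instance, with the XML update format, commitWithin goes on the <add> element as a value in milliseconds (10000 here is just an example):

```xml
<!-- ask Solr to commit within 10 seconds of this add, instead of
     issuing an explicit commit after every batch -->
<add commitWithin="10000">
  <doc>
    <field name="id">1</field>
  </doc>
</add>
```

With SolrJ, the equivalent is the overload that takes the interval, e.g. server.add(doclist, 10000).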
That's very strange. How much memory are you giving the JVM? And how much
memory is on your machine?
If your index is cutting in half on optimize, then it sounds like you're
re-indexing everything. Optimize will squeeze out all the data left around
by document deletes or updates, so the only reaso
What does "having a problem" mean? Index-time? Query-time?
But your problem is most likely the tokenizer as you suspect. Try something
like WhitespaceTokenizer and build up from there.
Three friends:
1> admin/analysis page
2> admin/schema-browser
3> &debugQuery=on
The first will show you what the
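The third friend is just a request parameter; for example (core and field names are made up):

```
http://localhost:8983/solr/collection1/select?q=title:foo&debugQuery=on
```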