Hi Erick,
For me, this ClassCastException is caused by the wrong use of TokenFilter. In
the fieldType declaration (schema.xml), I've put:
And instead of using TokenizerFactory in my class, I used TokenFilterFactory,
like this:
public class SentenceTokenizerFactory extends TokenFilterFactory
So when
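For reference, a minimal sketch of the direction that avoids the cast failure (the class body below is a made-up placeholder targeting the Lucene/Solr 4.x API, not the Taming Text implementation): the class wired into the <tokenizer .../> element of the fieldType must be a factory that extends org.apache.lucene.analysis.util.TokenizerFactory and hands Solr a Tokenizer like this one; a factory extending TokenFilterFactory cannot be used in that slot.

import java.io.IOException;
import java.io.Reader;

import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

public final class SentenceTokenizer extends Tokenizer {

  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class);
  private boolean done = false;
  private int finalOffset = 0;

  public SentenceTokenizer(Reader input) {
    super(input);  // Lucene/Solr 4.x: a Tokenizer is constructed over the field's Reader
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (done) {
      return false;
    }
    clearAttributes();
    done = true;
    // Placeholder logic for the sketch: emit the whole input as one token.
    // A real sentence tokenizer would split on sentence boundaries here.
    StringBuilder sb = new StringBuilder();
    char[] buffer = new char[1024];
    int len;
    while ((len = input.read(buffer)) > 0) {
      sb.append(buffer, 0, len);
    }
    if (sb.length() == 0) {
      return false;
    }
    termAtt.setEmpty().append(sb);
    finalOffset = correctOffset(sb.length());
    offsetAtt.setOffset(correctOffset(0), finalOffset);
    return true;
  }

  @Override
  public void end() throws IOException {
    super.end();
    offsetAtt.setOffset(finalOffset, finalOffset);
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    done = false;
    finalOffset = 0;
  }
}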
Thanks for letting us know the resolution; the problem was bugging me.
Erick
On Wed, Mar 25, 2015 at 4:21 PM, Test Test wrote:
> Re,
> Finally, I think I found where this problem comes from. I didn't use the
> right base class: instead of using a Tokenizer, I'm using a TokenFilter.
> Eric, thanks for your replies.
Re,
Finally, I think I found where this problem comes from. I didn't use the right
base class: instead of using a Tokenizer, I'm using a TokenFilter.
Eric, thanks for your replies. Regards.
On Wednesday, March 25, 2015 at 11:55 PM, Test Test wrote:
Re,
I have tried to remove all the redundant jar files. Then I relaunched it, but
it gets blocked right away on the same issue.
It's very strange.
Regards,
On Wednesday, March 25, 2015 at 11:31 PM, Erick Erickson wrote:
Wait, you didn't put, say, lucene-core-4.10.2.jar into your
contrib/tamingtext/dependency directory, did you? That means you have
Lucene (and solr and solrj and ...) on your classpath twice, since
they're _already_ in your classpath by default because you're running
Solr.
All your jars should be in y
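As a quick sanity check for duplicate jars, a small sketch along these lines can help (the class name is made up, and it only inspects the classloader it runs under, so it is most telling when run inside the same JVM/webapp as Solr): if the probe resource prints more than one location, the class is loadable from more than one jar.

import java.net.URL;
import java.util.Enumeration;

public class ClasspathDupCheck {
  public static void main(String[] args) throws Exception {
    // Any class you suspect is duplicated works as a probe; Tokenizer is just an example.
    String probe = "org/apache/lucene/analysis/Tokenizer.class";
    Enumeration<URL> locations =
        ClasspathDupCheck.class.getClassLoader().getResources(probe);
    while (locations.hasMoreElements()) {
      // One line per jar/directory that can supply this class; more than one
      // line is how version-mix ClassCastExceptions usually start.
      System.out.println(locations.nextElement());
    }
  }
}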
Re,
Sorry about the image. So here are all my dependency jars, listed below:
- commons-cli-2.0-mahout.jar
- commons-compress-1.9.jar
- commons-io-2.4.jar
- commons-logging-1.2.jar
- httpclient-4.4.jar
- httpcore-4.4.jar
- httpmime-4.4.jar
- junit-4.10.jar
- log4j-1.2.17.jar
- lucene-analyzers-common-...
Images don't come through the mailing list, so I can't see your image.
Whether or not all the jars in the directory you're working on are
consistent is the least of your problems. Are the libs to be found in any
_other_ place specified on your classpath?
Best,
Erick
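One way to answer that from the running JVM is to ask which jar actually supplied a representative class (a sketch; the class name WhichJar is arbitrary, and getCodeSource() can return null for bootstrap-loaded classes):

import org.apache.lucene.analysis.util.TokenizerFactory;

public class WhichJar {
  public static void main(String[] args) {
    // Prints the jar (or directory) this JVM loaded TokenizerFactory from.
    System.out.println(
        TokenizerFactory.class.getProtectionDomain().getCodeSource().getLocation());
  }
}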
On Wed, Mar 25, 2015 at 12:36 AM,
Thanks Eric,
I'm working on Solr 4.10.2, and all my dependency jars seem to be compatible
with this version.
I can't figure out which one causes this issue.
Thanks. Regards,
On Tuesday, March 24, 2015 at 11:45 PM, Erick Erickson wrote:
bq: 13 more
Caused by: java.lang.ClassCastException: class
com.tamingtext.texttamer.solr.
This usually means you have jar files from different versions of Solr
in your classpath.
Best,
Erick
On Tue, Mar 24, 2015 at 2:38 PM, Test Test wrote:
> Hi there,
> I'm trying to create my own TokenizerFactory
Think I figured it out: the tokens just needed the same position attribute.
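For anyone landing here later, a rough sketch of that mechanism (the filter below and its "_alt" token are invented for illustration, not Jamie's code): every extra token emitted at the same position gets a position increment of 0 via PositionIncrementAttribute.

import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;
import org.apache.lucene.util.AttributeSource;

public final class SamePositionFilter extends TokenFilter {

  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final PositionIncrementAttribute posIncAtt = addAttribute(PositionIncrementAttribute.class);
  private AttributeSource.State pending;  // saved copy of the last emitted token

  public SamePositionFilter(TokenStream input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (pending != null) {
      // Emit a second, stacked token: restoring the saved state copies the
      // original attributes, then posInc=0 places it at the *same* position.
      restoreState(pending);
      pending = null;
      termAtt.append("_alt");             // made-up variant term for the sketch
      posIncAtt.setPositionIncrement(0);  // 0 = same position as the previous token
      return true;
    }
    if (!input.incrementToken()) {
      return false;
    }
    pending = captureState();  // remember it so we can stack a clone next call
    return true;               // pass the original token through unchanged
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    pending = null;
  }
}

Presumably that is also why the phrase-query behaviour went away: the classic query parser treats same-position terms as alternatives (an OR/synonym-style query) rather than as consecutive words of a phrase.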
On Thu, Feb 9, 2012 at 10:38 PM, Jamie Johnson wrote:
> Thanks Robert, worked perfectly for the index side of the house. Now on
> the query side I have a similar Tokenizer, but it's not operating
> quite the way I want it
Thanks Robert, worked perfectly for the index side of the house. Now on
the query side I have a similar Tokenizer, but it's not operating
quite the way I want it to. The query tokenizer generates the tokens
properly except I'm ending up with a phrase query, i.e. field:"1 2 3
4" when I really want f
On Thu, Feb 9, 2012 at 8:54 PM, Jamie Johnson wrote:
> Again thanks. I'll take a stab at that. Are you aware of any
> resources/examples of how to do this? I figured I'd start with
> WhiteSpaceTokenizer but wasn't sure if there was a simpler place to
> start.
>
Well, easiest is if you can build
Again thanks. I'll take a stab at that. Are you aware of any
resources/examples of how to do this? I figured I'd start with
WhiteSpaceTokenizer but wasn't sure if there was a simpler place to
start.
On Thu, Feb 9, 2012 at 8:44 PM, Robert Muir wrote:
> On Thu, Feb 9, 2012 at 8:28 PM, Jamie Johnson wrote:
On Thu, Feb 9, 2012 at 8:28 PM, Jamie Johnson wrote:
> Thanks Robert, I'll take a look there. Does it sound like I'm on the
> right track with what I'm implementing? In other words, is a
> TokenFilter appropriate, or is there something else that would be a
> better fit for what I've described?
Thanks Robert, I'll take a look there. Does it sound like I'm on the
right track with what I'm implementing? In other words, is a
TokenFilter appropriate, or is there something else that would be a
better fit for what I've described?
On Thu, Feb 9, 2012 at 6:44 PM, Robert Muir wrote:
> I
If you are writing a custom tokenstream, I recommend using some of the
resources in Lucene's test-framework.jar to test it.
These find lots of bugs! (including thread-safety bugs)
For a filter: I recommend using the assertions in
BaseTokenStreamTestCase: assertTokenStreamContents, assertAnalyzesTo, ...
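To make that concrete, here is a rough sketch of such a test against the Lucene 4.x test-framework (MockTokenizer plus the stock LowerCaseFilter stand in for whatever custom tokenizer or filter you are actually testing; swap your own components into createComponents):

import java.io.Reader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.BaseTokenStreamTestCase;
import org.apache.lucene.analysis.MockTokenizer;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.core.LowerCaseFilter;

public class TestMyAnalysisChain extends BaseTokenStreamTestCase {

  // Stand-in analysis chain; replace LowerCaseFilter with your own filter.
  private Analyzer newAnalyzer() {
    return new Analyzer() {
      @Override
      protected TokenStreamComponents createComponents(String fieldName, Reader reader) {
        // MockTokenizer (from test-framework) adds extra state checks that
        // catch incrementToken()/reset() contract violations.
        Tokenizer tokenizer = new MockTokenizer(reader, MockTokenizer.WHITESPACE, false);
        return new TokenStreamComponents(tokenizer,
            new LowerCaseFilter(TEST_VERSION_CURRENT, tokenizer));
      }
    };
  }

  public void testBasics() throws Exception {
    // expected terms, plus expected position increments
    assertAnalyzesTo(newAnalyzer(), "Quick Brown FOX",
        new String[] {"quick", "brown", "fox"},
        new int[] {1, 1, 1});
  }

  public void testRandomStrings() throws Exception {
    // feeds lots of random text through the chain; good at finding
    // offset, reuse, and thread-safety bugs
    checkRandomData(random(), newAnalyzer(), 1000);
  }
}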