to behave in terms of things like phrase queries?
You need to be a lot more clear about your use case.
-- Jack Krupansky
-----Original Message-----
From: vsl
Sent: Wednesday, March 13, 2013 6:11 AM
To: solr-user@lucene.apache.org
Subject: Re: Special characters not indexed
After changing to the white space tokenizer there are still no results for the
given search term "&". Only when the whole string ("§$ %&/( )=? +*#'-<>") was
entered as the search term was this document shown in the results.
Just to add to Jack's points, you can also use the term query parser to
avoid all the escaping for special characters, e.g.
fq={!term f=some_field}
See Erik's preso from Apache Eurocon 2012 around 25:50 -
http://vimeopro.com/user11514798/apache-lucene-eurocon-2012/video/55822628
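To make that concrete, here is a minimal sketch assuming a field named
some_field (just a placeholder) and the "&" term from the earlier mail:

  fq={!term f=some_field}&

The term parser builds a query on the literal value and bypasses the Lucene
query syntax entirely, so none of the query-parser metacharacters need
backslash escaping. URL encoding still applies when the request goes over
HTTP, though, so the ampersand itself would be sent as %26:

  fq={!term f=some_field}%26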
Use the white space tokenizer, and be sure to escape those special characters
in queries, since a number of them have meaning to the query parser. Or,
enclose the query terms in quotes.
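For reference, a rough sketch of what that could look like, assuming a
fieldType named text_ws in schema.xml (the type and field names are only
examples):

  <!-- whitespace-only tokenization: punctuation stays inside the tokens -->
  <fieldType name="text_ws" class="solr.TextField" positionIncrementGap="100">
    <analyzer>
      <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    </analyzer>
  </fieldType>

and on the query side, either backslash-escape the metacharacters or quote the
whole term:

  q=some_field:\(\&\)
  q=some_field:"(&)"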
-- Jack Krupansky
-----Original Message-----
From: vsl
Sent: Tuesday, March 12, 2013 11:16 AM
To: solr-user@lucene.apache.org