Steve's comment is much more germane. KeywordTokenizer,
used in alphaOnlySort last I knew, is not appropriate at all.
Do you really want single tokens that consist of the entire
document for sorting purposes? Wouldn't the first 1K be enough?
It looks like this was put in in 4.0, so I'm guessing you
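Purely as a sketch of that "first 1K" idea (nothing in this thread uses these exact names or values): if you do need a sortable copy of large values, the single token that KeywordTokenizer emits can be capped with a truncate filter before it ever reaches the 32766-byte term limit, along these lines:

  <fieldType name="alphaOnlySortTrunc" class="solr.TextField" sortMissingLast="true" omitNorms="true">
    <analyzer>
      <!-- the whole field value comes through as one token -->
      <tokenizer class="solr.KeywordTokenizerFactory"/>
      <!-- keep only the first 1024 characters of that token; 1024 is an arbitrary example value -->
      <filter class="solr.TruncateTokenFilterFactory" prefixLength="1024"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.TrimFilterFactory"/>
    </analyzer>
  </fieldType>

The field type name and the prefixLength are placeholders; anything that keeps the token comfortably under 32766 bytes of UTF-8 will avoid the error.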
Erick, thanks for the response. I have a number of documents in our database
where solr throws the same exception for *_tsing types. However, when I
index the same document with our solr 4.7, it is successfully indexed. So, I
assume something is different between 4.7 and
7.3. I wa
On 5/1/2018 8:40 AM, THADC wrote:
> I get the following exception:
>
> *Exception writing document id FULL_36265 to the index; possible analysis
> error: Document contains at least one immense term in
> field="gridFacts_tsing" (whose UTF8 encoding is longer than the max length
> 32766), all of whic
The input in the error message starts with “lorem ipsum”, so it contains spaces, but
the alphaOnlySort field type (in Solr’s example schemas, anyway) uses
KeywordTokenizer, which emits the entire input as a single token.
As Erick implied, you probably should not be doing that with this kind of data.
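For reference, the alphaOnlySort definition in the example schemas looks roughly like this (quoting from memory, so check your own schema for the exact chain):

  <fieldType name="alphaOnlySort" class="solr.TextField" sortMissingLast="true" omitNorms="true">
    <analyzer>
      <!-- KeywordTokenizer does no real tokenizing: the entire input becomes one token -->
      <tokenizer class="solr.KeywordTokenizerFactory"/>
      <filter class="solr.LowerCaseFilterFactory"/>
      <filter class="solr.TrimFilterFactory"/>
      <!-- strips non-letters, but nothing here shortens an immense token below the 32766-byte limit -->
      <filter class="solr.PatternReplaceFilterFactory" pattern="([^a-z])" replacement="" replace="all"/>
    </analyzer>
  </fieldType>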
You're sending it a huge term. My guess is you're sending something
like base64-encoded data or perhaps just a single unbroken string in
your field.
Examine your document; it should jump out at you.
Best,
Erick
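One more option, which nobody in the thread suggested, so treat it as an untested sketch: if a handful of documents have legitimately huge values and you would rather lose the sort key for those documents than have the whole add fail, a length filter can silently drop the oversized token:

  <analyzer>
    <tokenizer class="solr.KeywordTokenizerFactory"/>
    <!-- drop any token longer than 8000 characters rather than letting it hit the 32766-byte limit -->
    <filter class="solr.LengthFilterFactory" min="1" max="8000"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>

The 8000 is arbitrary; max is measured in characters, so keep it well under 32766 bytes of UTF-8.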
On Tue, May 1, 2018 at 7:40 AM, THADC wrote:
> Hello,
>
> We are migrating from solr
Hello,
We are migrating from solr 4.7 to 7.3. When I encounter a data item that
matches a custom dynamic field from our 4.7 schema:
**
I get the following exception:
*Exception writing document id FULL_36265 to the index; possible analysis
error: Document contains at least one immense term in