A big thanks to Yonik and Mark. Using the raw term query I was able
to find the range(!) of documents that had bad integer field values.
Deleting those documents, committing and optimizing cleared up the
issue.
Still not sure how the bad values were inserted in the first place,
but that is another issue.
On Fri, Feb 26, 2010 at 10:59 AM, Mark Miller wrote:
> You have to find the document with the bad value somehow.
>
> In the past I have used Luke to help with this.
>
> Then you need to delete the document.
You can also find the document with a raw term query.
q={!raw f=myfield}104708<
-Yonik
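Since "{", "!" and "<" are not URL-safe, the raw query has to be escaped when sent over HTTP. A minimal sketch of building that request in Python, assuming a default local Solr install and using the pk_i field named later in this thread (host, port, and core path are placeholders to adjust for your deployment):

```python
from urllib.parse import urlencode

# Hypothetical Solr select URL; adjust host/port/core for your setup.
SOLR_SELECT = "http://localhost:8983/solr/select"

# The {!raw} parser matches the indexed term verbatim, bypassing the
# field type's parsing, so the malformed term "104708<" is findable
# even though it is not a valid integer.
params = urlencode({"q": "{!raw f=pk_i}104708<", "fl": "pk_i"})
url = SOLR_SELECT + "?" + params
print(url)
```

urlencode takes care of percent-encoding the local-params syntax ("{!raw ...}" becomes "%7B%21raw+...%7D"), which is easy to get wrong by hand.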
You have to find the document with the bad value somehow.
In the past I have used Luke to help with this.
Then you need to delete the document.
Finally, you have to get the deleted document out of the index through a
merge (else the bad term will still be loaded by the FieldCache) -
easiest with an optimize.
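The sequence Mark describes maps onto three XML update messages posted to the update handler: a delete, a commit, and an optimize (the optimize forces the merge that physically drops the deleted docs). A sketch of the message bodies, assuming the pk_i field from this thread and a recent enough Solr for delete-by-query to go through the query-parser plugins; note the "<" in the bad value must be escaped as "&lt;" inside the XML:

```python
# Hypothetical update URL; adjust host/port/core for your deployment.
UPDATE = "http://localhost:8983/solr/update"

# Delete every document whose pk_i holds the bad raw term, then commit,
# then optimize so deleted docs are merged away and the bad term can no
# longer be seen by the FieldCache.
steps = [
    "<delete><query>{!raw f=pk_i}104708&lt;</query></delete>",
    "<commit/>",
    "<optimize/>",
]
for body in steps:
    print("POST %s\n%s" % (UPDATE, body))
```

If your Solr version's delete-by-query does not accept local params like {!raw}, delete by unique key instead once the raw select query has identified the documents.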
Thank you for taking the time to look at my issue and respond.
Do you have any suggestions for purging the document with this field
from the index? Would that even help?
I do not know which document has the corrupt value, and searching for
the document with something like
pk_i:104708<
does return an error.
One of your field values isn't a valid integer, it's "104708<"
You're probably using the straight integer type in 1.3, which is meant
for back compat with existing Lucene indexes and currently doesn't do
validation on its input.
For Solr 1.4, "int" is a new field type (example schema maps it to
TrieIntField), which does validate its input.
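Until an upgrade to a validating field type, one workaround is to validate values on the client side before indexing. A minimal sketch (the helper name is ours, not Solr's; note Python's int() is slightly laxer than Java's Integer.parseInt, e.g. it tolerates surrounding whitespace):

```python
def is_valid_int(value):
    """Reject values like "104708<" before they reach a Solr 1.3
    'integer' field, which performs no validation of its own."""
    try:
        int(value)
        return True
    except ValueError:
        return False

print(is_valid_int("104708"))   # well-formed value
print(is_valid_int("104708<"))  # the corrupt value from this thread
```

Catching bad values at index time is much cheaper than hunting them down later with raw term queries.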