Here is what we have:

For all our documents we have a field called "small_body", a text
field of at most 60 characters where we store the "abstract" for each
article.

We have about 8,000,000 documents indexed, and we usually display this
small_body on our "listing pages".

For each listing page we load 50 documents at a time, which means the
small_body field (the one we want to compress) has to be retrieved and
displayed on every request.
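
A typical listing-page request looks roughly like this (the handler,
query and the extra fields are only an illustration, not our exact
setup):

    http://localhost:8983/solr/select?q=category:news&start=0&rows=50&fl=id,title,small_body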

I'll probably enable compression for this field, run a one-week test to
see the outcome, and roll it back if it doesn't help.

Last question: what's the best way to determine the compress threshold?
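
For reference, this is roughly the field definition I'm planning to
try. The compressed / compressThreshold attributes are what I
understand the Solr 1.x schema.xml options to be, and the threshold
value below is only a placeholder (that number is exactly what I'm
asking about):

    <!-- schema.xml (sketch): store small_body compressed;
         type/indexed are illustrative, threshold is a placeholder -->
    <field name="small_body" type="text" indexed="true" stored="true"
           compressed="true" compressThreshold="50"/>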

Grant Ingersoll wrote:
> 
> 
> On Jun 4, 2009, at 6:42 AM, Erick Erickson wrote:
> 
>>
>> It *will* cause performance issues if you load that field for a large
>> number of documents on a particular search. I know Lucene itself
>> has lazy field loading that helps in this case, but I don't know how
>> to persuade SOLR to use it (it may even lazy-load automatically).
>> But this is separate from searching...
> 
> Lazy loading is an option configured in the solrconfig.xml
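
If I understand Grant correctly, the setting in question is
enableLazyFieldLoading in the <query> section of solrconfig.xml. A
sketch of what I'll try (the placement inside <query> is my reading of
the example solrconfig.xml):

    <!-- solrconfig.xml (sketch): only load stored fields that are actually requested -->
    <query>
      ...
      <enableLazyFieldLoading>true</enableLazyFieldLoading>
      ...
    </query>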

-- 
View this message in context: 
http://www.nabble.com/Field-Compression-tp15258669p23879859.html
Sent from the Solr - User mailing list archive at Nabble.com.
