Rahul,

Here's a suggestion:
Write a simple app that uses *Lucene* to create N indices, one for each of the 
documents you want to test.  Then you can look at their sizes on disk.

Not sure it's super valuable to see the sizes of individual documents, but you 
can do it as described above.
Of course, if you *store* all your data, the index will be bigger than the 
original/input data.
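A minimal sketch of the "look at their sizes on disk" step, in plain Java. It assumes you have already indexed each document into its own directory (e.g. with Lucene's IndexWriter, one FSDirectory per document — that part is not shown); the helper just sums the file sizes under an index directory so you can compare them. Names like `IndexSizeCheck` and the `/tmp`-style paths are illustrative, not from any Solr/Lucene API:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class IndexSizeCheck {

    // Sum the sizes of all regular files under an index directory.
    // After indexing document i into its own directory with Lucene
    // (IndexWriter over FSDirectory.open(path), addDocument, close),
    // call this on each directory and compare the results.
    static long indexSizeOnDisk(Path indexDir) throws IOException {
        try (Stream<Path> files = Files.walk(indexDir)) {
            return files.filter(Files::isRegularFile)
                        .mapToLong(p -> p.toFile().length())
                        .sum();
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a real Lucene index directory:
        // a temp dir containing one 1 KB file.
        Path dir = Files.createTempDirectory("index-0");
        Files.write(dir.resolve("_0.cfs"), new byte[1024]);
        System.out.println("index size in bytes: " + indexSizeOnDisk(dir));
    }
}
```

Note the caveat above still applies: per-document sizes measured this way include per-index overhead (segment metadata etc.), so N tiny one-document indices will sum to more than one index holding all N documents.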

Otis
----
Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Lucene ecosystem search :: http://search-lucene.com/



----- Original Message ----
> From: rahul <asharud...@gmail.com>
> To: solr-user@lucene.apache.org
> Sent: Tue, April 19, 2011 7:49:39 AM
> Subject: Solr indexing size for a particular document.
> 
> Hi,
> 
> Is there a way to find out the Solr index size for a particular document? I
> am using SolrJ to index the documents.
> 
> Assume I am indexing multiple fields like title, description, content, and a
> few integer fields in schema.xml. Once I index the content, is there a way to
> identify the index size for that particular document, either during or after
> indexing?
> 
> Because most of the common words are excluded via stopwords.txt using
> StopFilterFactory, I just want to calculate the actual index size of the
> particular document. Is there any way to do this in current Solr?
> 
> thanks,
> 
> 
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/Solr-indexing-size-for-a-particular-document-tp2838416p2838416.html
>
> Sent from the Solr - User mailing list archive at Nabble.com.
> 
