I think you're going at this from the wrong end. There's no magical way to give your process more memory than you have on your machine. But there's also no correlation between the size of your data, especially when it contains video files, and the size of your Solr index. You're not going to index the binary content, just the metadata. How do you intend to extract it anyway? Tika? If so, from what format? I think Tika only supports Flash for video, but that's just a vague memory.
I think you need to back up and explain a bit more about what you intend to accomplish; this feels like an XY problem, see: http://people.apache.org/~hossman/#xyproblem

Best
Erick

On Mon, Oct 3, 2011 at 3:41 AM, hadi <md.anb...@gmail.com> wrote:
>
> I have a large data store (about 500GB of PDF and video files) and my
> machine has 4GB of RAM. I want to index these files with the SolrJ APIs.
> What are the necessary solrconfig and JVM settings to avoid heap-size
> problems and other memory crashes during indexing? Is there any
> configuration to force the garbage collector to deallocate memory during
> indexing?
>
> thanks
>
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Memory-managment-and-JVM-setting-for-solr-tp3389093p3389093.html
> Sent from the Solr - User mailing list archive at Nabble.com.
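For what it's worth, on the JVM side of the original question a common starting point on a 4GB box is to cap the heap well below physical RAM so the OS file cache can hold the index files. The exact values below are illustrative guesses, not tuned recommendations, and assume Solr is launched via its stock `start.jar`:

```shell
# Illustrative only: heap capped at 2g of the 4g machine, leaving the
# rest for the OS page cache (Lucene benefits heavily from cached index
# files). CMS was the usual low-pause collector in this era of the JVM.
java -Xms1g -Xmx2g \
     -XX:+UseConcMarkSweepGC \
     -jar start.jar
```

There's no supported way to "force" the collector to free memory mid-indexing; the practical lever is sending documents to Solr in modest batches and not holding references to them client-side, so they become collectable as you go.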