Hi, I am trying to index a heavy dataset, and one particular field in it is really large...
However, as soon as I start indexing, I get a memory warning and a rollback (OutOfMemoryError). I have learned that we can pass the -Xmx1024m option to the java command when starting Solr to allocate more heap memory. My question is: since that could also become insufficient later, is the issue related to caching? Here is my cache block in solrconfig.xml:

    <filterCache class="solr.FastLRUCache"
                 size="512"
                 initialSize="512"
                 autowarmCount="0"/>

    <queryResultCache class="solr.LRUCache"
                      size="512"
                      initialSize="512"
                      autowarmCount="0"/>

    <documentCache class="solr.LRUCache"
                   size="512"
                   initialSize="512"
                   autowarmCount="0"/>

I am thinking maybe I need to turn off the documentCache. Anyone got a better idea? Or is there perhaps another issue here? Just to let you know, until I added that very heavy db field for indexing, everything was just fine...

--
Regards,
Raheel Hasan
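P.S. For reference, this is how I am starting Solr with the bigger heap (this assumes the standard example setup launched via start.jar; adjust if you run Solr differently):

    java -Xmx1024m -jar start.jar

And if disabling the documentCache turns out to be the right move, my guess is I would just comment that element out of solrconfig.xml, something like:

    <!-- documentCache disabled while debugging the OutOfMemoryError
    <documentCache class="solr.LRUCache"
                   size="512"
                   initialSize="512"
                   autowarmCount="0"/>
    -->

but I am not sure whether that is the recommended way to turn it off.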