I am debugging an out-of-memory error myself, so a few suggestions:
1) Are you looking at your search logs around the time of the memory
error? In my case, I found a few bad queries requesting a ton of rows
(basically the whole index's worth, which I think is a bug somewhere
in our app that I just have to find) that happened close to the OOM
error being thrown. (Rough grep sketch after this list.)
2) Do you have Solr hooked up to something like NewRelic/AppDynamics
to see the cache usage in real time? Maybe, as was suggested, tuning
down or eliminating rarely used caches could help. (Example config
after this list.)
3) Are you ensuring that you aren't setting stored="true" on fields
that don't need it? Stored fields increase the index size, and
possibly the document cache's memory footprint if lazy field loading
isn't enabled (to be honest, I am a bit unclear on this part since I
haven't had much experience with it myself). (Example after this
list.)
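
For 1), something like this against the Solr request log can surface
the offenders. The log path and the rows= threshold are guesses from
my setup, so adjust both:

  # count requests asking for 10,000+ rows (log path/format may differ)
  grep -oE 'rows=[0-9]{5,}' /var/log/solr/solr.log | sort | uniq -c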
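For 2), the caches live in solrconfig.xml. The sizes below are made-up
starting points, not recommendations; shrink whatever shows a low hit
ratio on the stats page:

  <filterCache class="solr.FastLRUCache" size="512"
               initialSize="512" autowarmCount="0"/>
  <queryResultCache class="solr.LRUCache" size="512"
                    initialSize="512" autowarmCount="0"/>
  <documentCache class="solr.LRUCache" size="512" initialSize="512"/>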
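For 3), a sketch of what I mean ("body" is just a hypothetical field
name; the field type has to exist in your schema):

  <!-- schema.xml: searchable but not stored -->
  <field name="body" type="text_general" indexed="true" stored="false"/>

  <!-- solrconfig.xml: only load the stored fields a request asks for -->
  <enableLazyFieldLoading>true</enableLazyFieldLoading>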

Thanks
Amit

On Mon, Aug 13, 2012 at 11:37 AM, Jon Drukman <jdruk...@gmail.com> wrote:
> On Sun, Aug 12, 2012 at 12:31 PM, Alexey Serba <ase...@gmail.com> wrote:
>
>> > It would be vastly preferable if Solr could just exit when it gets a
>> > memory error, because we have it running under daemontools, and that
>> > would cause an automatic restart.
>> -XX:OnOutOfMemoryError="<cmd args>; <cmd args>"
>> Run user-defined commands when an OutOfMemoryError is first thrown.
>>
>> > Does Solr require the entire index to fit in memory at all times?
>> No.
>>
>> But it's hard to say about your particular problem without additional
>> information. How often do you commit? Do you use faceting? Do you sort
>> by Solr fields and, if so, what are those fields? And you should also
>> check caches.
>>
>
> I upgraded to solr-3.6.1 and an extra-large Amazon instance (15 GB RAM),
> so we'll see if that helps.  So far, no out-of-memory errors.
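
One more thought on the -XX:OnOutOfMemoryError flag quoted above:
under daemontools, having the JVM kill itself on the first OOM should
be enough to trigger the automatic restart. A sketch (the heap size
and start command are placeholders for whatever your run script does;
%p expands to the JVM's pid):

  java -Xmx4g -XX:OnOutOfMemoryError="kill -9 %p" -jar start.jar

Once the process exits, supervise brings it back up on its own.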
