Thanks for the quick response. Forgive the naïve question, but shouldn't it be doing garbage collection automatically? Having to manually force GC via jconsole isn't a sustainable solution.
Thanks again, betsey

On 4/14/16, 2:54 PM, "Erick Erickson" <erickerick...@gmail.com> wrote:

>well, things _are_ running, specifically the communications channels
>are looking for incoming messages and the like, generating garbage
>etc.
>
>Try attaching jconsole to the process and hitting the GC button to
>force a garbage collection. As long as your memory gets to some level
>and drops back to that level after forcing GCs, you'll be fine.
>
>Best,
>Erick
>
>On Thu, Apr 14, 2016 at 11:45 AM, Betsey Benagh
><betsey.ben...@stresearch.com> wrote:
>> X-posted from stack overflow...
>>
>> I'm running solr 6.0.0 in server mode. I have one core. I loaded about
>> 2000 documents in, and it was using about 54 MB of memory. No problem.
>> Nobody was issuing queries or doing anything else, but over the course
>> of about 4 hours, the memory usage had tripled to 152 MB. I shut solr
>> down and restarted it, and saw the memory usage back at 54 MB. Again,
>> with no queries or anything being executed against the core, the memory
>> usage is creeping up - after 17 minutes, it was up to 60 MB. I've looked
>> at the documentation for how to limit memory usage, but I want to
>> understand why it's creeping up when nothing is happening, lest it run
>> out of memory when I limit the usage. The machine is running CentOS 6.6,
>> if that matters, with Java 1.8.0_65.
>>
>> Thanks!
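For what it's worth, the check Erick describes doesn't have to be done by hand in jconsole. Here is a minimal sketch that connects over JMX the same way jconsole does, forces a collection, and prints heap use before and after. It assumes Solr was started with remote JMX enabled; the port (18983) and the no-auth/no-SSL flags are just illustrative choices for a local test box, not anything Solr sets by default.

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import javax.management.MBeanServerConnection;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;

    public class SolrHeapCheck {
        public static void main(String[] args) throws Exception {
            // Assumes Solr's JVM was started with remote JMX enabled, e.g.:
            //   -Dcom.sun.management.jmxremote
            //   -Dcom.sun.management.jmxremote.port=18983      (hypothetical port)
            //   -Dcom.sun.management.jmxremote.authenticate=false
            //   -Dcom.sun.management.jmxremote.ssl=false
            JMXServiceURL url = new JMXServiceURL(
                    "service:jmx:rmi:///jndi/rmi://localhost:18983/jmxrmi");

            try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
                MBeanServerConnection conn = connector.getMBeanServerConnection();

                // Proxy for the remote JVM's Memory MXBean (same data jconsole shows).
                MemoryMXBean memory = ManagementFactory.newPlatformMXBeanProxy(
                        conn, ManagementFactory.MEMORY_MXBEAN_NAME, MemoryMXBean.class);

                long before = memory.getHeapMemoryUsage().getUsed();

                // Equivalent to pressing "Perform GC" in jconsole.
                memory.gc();
                Thread.sleep(2000);  // give the collector a moment to finish

                long after = memory.getHeapMemoryUsage().getUsed();
                System.out.printf("Heap used: %d MB before GC, %d MB after%n",
                        before / (1024 * 1024), after / (1024 * 1024));
            }
        }
    }

Run on a schedule, something like this would show whether the idle growth is just uncollected garbage (heap drops back to roughly the 54 MB baseline after every GC) or a genuine leak (the post-GC floor keeps rising over time).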