Re: OutOfMemoryError in 6.5.1

2017-11-29 Thread Walter Underwood
No faceting. Highlighting. We have very long queries, because students are pasting homework problems. I’ve seen 1000 word queries, but we truncate at 40 words. We do as-you-type results, so we also have ngram fields on the 20 million solved homework questions. This bloats the index severely. A

Re: OutOfMemoryError in 6.5.1

2017-11-29 Thread Toke Eskildsen
Walter Underwood wrote: > I knew about SOLR-7433, but I’m really surprised that 200 incoming requests > can need 4000 threads. > > We have four shards. For that I would have expected at most 800 Threads. Are you perhaps doing faceting on multiple fields with facet.threads=5? (kinda grasping at

Re: OutOfMemoryError in 6.5.1

2017-11-29 Thread Walter Underwood
I knew about SOLR-7433, but I’m really surprised that 200 incoming requests can need 4000 threads. We have four shards. Why is there a thread per shard? HTTP can be done async: send1, send2, send3, send4, recv1 recv2, recv3, recv4. I’ve been doing that for over a decade with HTTPClient. wunde

Re: OutOfMemoryError in 6.5.1

2017-11-29 Thread Toke Eskildsen
Walter Underwood wrote: > I set this in jetty.xml, but it still created 4000 threads. > > > > > > That sets a limit on the number of threads started by Jetty to handle incoming connections, but does not affect how many threads Solr can create. I guess you have ~20 shards in your

Re: OutOfMemoryError in 6.5.1

2017-11-28 Thread Walter Underwood
I’m pretty sure these OOMs are caused by uncontrolled thread creation, up to 4000 threads. That requires an additional 4 GB (1 MB per thread). It is like Solr doesn’t use thread pools at all. I set this in jetty.xml, but it still created 4000 threads. wunder Walter Underwood wun.
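
For context, in a stock Solr 6.x install the cap Walter mentions lives in server/etc/jetty.xml and is driven by the solr.jetty.threads.max property; a sketch of that block (values are the shipped defaults as best I recall, so treat them as approximate):

    <!-- server/etc/jetty.xml (sketch): Jetty's request thread pool.
         This bounds only the container's request threads; executors created
         inside Solr (e.g. for distributed search) are not covered by it. -->
    <New id="threadPool" class="org.eclipse.jetty.util.thread.QueuedThreadPool">
      <Set name="minThreads"><Property name="solr.jetty.threads.min" default="10"/></Set>
      <Set name="maxThreads"><Property name="solr.jetty.threads.max" default="10000"/></Set>
      <Set name="idleTimeout"><Property name="solr.jetty.threads.idle.timeout" default="5000"/></Set>
    </New>

The same cap can be lowered without editing the file by adding something like SOLR_OPTS="$SOLR_OPTS -Dsolr.jetty.threads.max=2000" to solr.in.sh.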

Re: OutOfMemoryError in 6.5.1

2017-11-23 Thread Damien Kamerman
I found the suggesters very memory hungry. I had one particularly large index where the suggester should have been filtering a small number of docs, but was mmap'ing the entire index. I only ever saw this behavior with the suggesters. On 22 November 2017 at 03:17, Walter Underwood wrote: > All o

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Shawn Heisey
On 11/21/2017 9:17 AM, Walter Underwood wrote: > All our customizations are in solr.in.sh. We’re using the one we configured > for 6.3.0. I’ll check for any differences between that and the 6.5.1 script. The order looks correct to me -- the arguments for the OOM killer are listed *before* the "-j
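
For reference, a sketch of how the 6.x bin/solr script orders the final command line (paths and port are illustrative); JVM options such as the OOM hook must precede -jar start.jar to be picked up by the JVM at all:

    # Shape of the command assembled by bin/solr in 6.x (sketch):
    java -server -Xms8g -Xmx8g -XX:+UseG1GC ... \
         -XX:OnOutOfMemoryError="/opt/solr/bin/oom_solr.sh 8983 /var/solr/logs" \
         -jar start.jar --module=http
    # Anything placed after "-jar start.jar" is passed to Jetty, not to the JVM,
    # which is why the OOM-killer arguments have to appear before it.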

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Erick Erickson
Walter: Yeah, I've seen this on occasion. IIRC, the OOM exception will be specific to running out of stack space, or at least slightly different from the "standard" OOM error. That would be the "smoking gun" for too many threads. Erick On Tue, Nov 21, 2017 at 9:00 AM, Walter Underwood wrote:

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Walter Underwood
I do have one theory about the OOM. The server is running out of memory because there are too many threads. Instead of queueing up the overload in the load balancer, it gets queued in new threads waiting to run. Setting solr.jetty.threads.max to 10,000 guarantees this will happen under overload. New R

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Erick Erickson
bq: but those use analyzing infix, so they are search indexes, not in-memory. Sure, but they can still consume heap. Most of the index is MMapped of course, but there are some control structures, indexes and the like still kept on the heap. I suppose not using the suggester would nail it though.

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Walter Underwood
All our customizations are in solr.in.sh. We’re using the one we configured for 6.3.0. I’ll check for any differences between that and the 6.5.1 script. I don’t see any arguments at all in the dashboard. I do see them in a ps listing, right at the end. java -server -Xms8g -Xmx8g -XX:+UseG1GC -X

Re: OutOfMemoryError in 6.5.1

2017-11-21 Thread Shawn Heisey
On 11/20/2017 6:17 PM, Walter Underwood wrote: When I ran load benchmarks with 6.3.0, an overloaded cluster would get super slow but keep functioning. With 6.5.1, we hit 100% CPU, then start getting OOMs. That is really bad, because it means we need to reboot every node in the cluster. Also,

Re: OutOfMemoryError in 6.5.1

2017-11-20 Thread Bernd Fehling
Hi Walter, you can check if the JVM OOM hook is acknowledged by JVM and setup in the JVM. The options are "-XX:+PrintFlagsFinal -version" You can modify your bin/solr script and tweak the function "launch_solr" at the end of the script. Replace "-jar start.jar" with "-XX:+PrintFlagsFinal -versio
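
A sketch of the temporary tweak Bernd describes; the variable names only approximate what bin/solr of that era actually uses:

    # In bin/solr's launch_solr function, temporarily swap the tail of the command:
    #   ... -jar start.jar ...      becomes      ... -XX:+PrintFlagsFinal -version
    # The JVM then prints every flag it accepted (including the OOM hook) and exits, e.g.:
    "$JAVA" "${SOLR_START_OPTS[@]}" -XX:+PrintFlagsFinal -version 2>&1 \
      | grep -Ei 'OnOutOfMemoryError|UseG1GC|MaxHeapSize'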

Re: OutOfMemoryError and Too many open files

2017-05-08 Thread Erick Erickson
Solr/Lucene really like having a bunch of files available, so bumping the ulimit is often the right thing to do. This assumes you don't have any custom code that is failing to close searchers and the like. Best, Erick On Mon, May 8, 2017 at 10:40 AM, Satya Marivada wrote: > Hi, > > Started gett
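
A sketch of the usual way to check and raise the open-files limit for the Solr process; the user name, pid placeholder, and values are illustrative:

    # Check what the running process actually has:
    ulimit -n                                        # soft limit for the current shell
    grep 'open files' /proc/<solr-pid>/limits        # limit of the live Solr JVM

    # Raise it persistently in /etc/security/limits.conf (then re-login / restart Solr):
    solr  soft  nofile  65536
    solr  hard  nofile  65536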

Re: OutOfMemoryError and Too many open files

2017-05-08 Thread Shawn Heisey
On 5/8/2017 11:40 AM, Satya Marivada wrote: > Started getting below errors/exceptions. I have listed the resolution > inline. Could you please see if I am headed right? > > java.lang.OutOfMemoryError: unable to create new native thread > java.io.IOException: Too many open files I have never had a

Re: OutOfMemoryError does not fire the script

2016-05-27 Thread Pablo Anzorena
Perfect, thank you very much. 2016-05-27 12:44 GMT-03:00 Shawn Heisey : > On 5/27/2016 7:05 AM, Pablo Anzorena wrote: > > I am using solr 5.2.1 in cloud mode. My jvm arguments for the > > OutOfMemoryError is > > -XX:OnOutOfMemoryError='/etc/init.d/solrcloud;restart' > > > > In the Solr UI, the ev

Re: OutOfMemoryError does not fire the script

2016-05-27 Thread Shawn Heisey
On 5/27/2016 7:05 AM, Pablo Anzorena wrote: > I am using solr 5.2.1 in cloud mode. My jvm arguments for the > OutOfMemoryError is > -XX:OnOutOfMemoryError='/etc/init.d/solrcloud;restart' > > In the Solr UI, the event is being fired, but nothing happens. In all versions before 5.5.1, that -XX param
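
Not from this thread's resolution, but a commonly suggested way to make the hook robust is to point it at a small wrapper script rather than an inline command; the script path and log file below are hypothetical, and /etc/init.d/solrcloud is simply the service named in the quoted message:

    #!/bin/sh
    # /opt/solr/bin/restart-on-oom.sh  (hypothetical wrapper)
    # Record the event, then restart the Solr service.
    echo "$(date) - JVM reported OutOfMemoryError, restarting Solr" >> /var/log/solr/oom-restart.log
    /etc/init.d/solrcloud restart

    # JVM side: pass the script as a single token, no embedded semicolons needed:
    #   -XX:OnOutOfMemoryError=/opt/solr/bin/restart-on-oom.sh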

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-16 Thread Erick Erickson
> On Thu, Jan 15, 2015 at 1:54 PM, wrote: >>> >>> Siegfried and Michael Thank you for your replies and help. >>>> >>>> -Original Message- >>>> From: Siegfried Goeschl [mailto:sgoes...@gmx.at] >>>> Sent: Thursday, January 15

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-16 Thread Jack Krupansky
ael Thank you for your replies and help. >>> >>> -Original Message- >>> From: Siegfried Goeschl [mailto:sgoes...@gmx.at] >>> Sent: Thursday, January 15, 2015 3:45 AM >>> To: solr-user@lucene.apache.org >>> Subject: Re: OutOfMemoryError for P

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-16 Thread Markus Jelsma
of > > getjmp/longjmp. But fast... > > > > On Thu, Jan 15, 2015 at 1:54 PM, wrote: > >> Siegfried and Michael Thank you for your replies and help. > >> > >> -Original Message- > >> From: Siegfried Goeschl [mailto:sgoes...@gmx.at] > &

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-16 Thread Charlie Hull
and Michael Thank you for your replies and help. -Original Message- From: Siegfried Goeschl [mailto:sgoes...@gmx.at] Sent: Thursday, January 15, 2015 3:45 AM To: solr-user@lucene.apache.org Subject: Re: OutOfMemoryError for PDF document upload into Solr Hi Ganesh, you can increase the

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-16 Thread Siegfried Goeschl
and Michael Thank you for your replies and help. -Original Message- From: Siegfried Goeschl [mailto:sgoes...@gmx.at] Sent: Thursday, January 15, 2015 3:45 AM To: solr-user@lucene.apache.org Subject: Re: OutOfMemoryError for PDF document upload into Solr Hi Ganesh, you can increase the

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-15 Thread Dan Davis
ael Thank you for your replies and help. > > -Original Message- > From: Siegfried Goeschl [mailto:sgoes...@gmx.at] > Sent: Thursday, January 15, 2015 3:45 AM > To: solr-user@lucene.apache.org > Subject: Re: OutOfMemoryError for PDF document upload into Solr > > Hi Ganes

RE: OutOfMemoryError for PDF document upload into Solr

2015-01-15 Thread Ganesh.Yadav
Siegfried and Michael Thank you for your replies and help. -Original Message- From: Siegfried Goeschl [mailto:sgoes...@gmx.at] Sent: Thursday, January 15, 2015 3:45 AM To: solr-user@lucene.apache.org Subject: Re: OutOfMemoryError for PDF document upload into Solr Hi Ganesh, you can

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-15 Thread Siegfried Goeschl
Hi Ganesh, you can increase the heap size but parsing a 4 GB PDF document will very likely consume A LOT OF memory - I think you need to check if that large PDF can be parsed at all :-) Cheers, Siegfried Goeschl On 14.01.15 18:04, Michael Della Bitta wrote: Yep, you'll have to increase the

Re: OutOfMemoryError for PDF document upload into Solr

2015-01-14 Thread Michael Della Bitta
Yep, you'll have to increase the heap size for your Tomcat container. http://stackoverflow.com/questions/6897476/tomcat-7-how-to-set-initial-heap-size-correctly Michael Della Bitta Senior Software Engineer o: +1 646 532 3062 appinions inc. “The Science of Influence Marketing” 18 East 41st St
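
A minimal sketch of the usual Tomcat-side change (the file is created if it does not exist; the heap values are illustrative, not a recommendation):

    # $CATALINA_HOME/bin/setenv.sh -- sourced automatically by catalina.sh on startup
    CATALINA_OPTS="$CATALINA_OPTS -Xms2g -Xmx4g"
    export CATALINA_OPTS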

Re: OutOfMemoryError

2014-12-16 Thread Trilok Prithvi
Shawn, looks like the JVM bump did the trick. Thanks! On Tue, Dec 16, 2014 at 10:39 AM, Trilok Prithvi wrote: > > Thanks Shawn. We will increase the JVM to 4GB and see how it performs. > > Alexandre, > Our queries are simple (with strdist() function in almost all the > queries). No facets, or sor

Re: OutOfMemoryError

2014-12-16 Thread Trilok Prithvi
Thanks Shawn. We will increase the JVM to 4GB and see how it performs. Alexandre, Our queries are simple (with strdist() function in almost all the queries). No facets, or sorts. But we do a lot of data loads. We index data a lot (several documents, ranging from 10 - 10 documents) and we uploa

Re: OutOfMemoryError

2014-12-16 Thread Alexandre Rafalovitch
What do your queries look like? Especially FQs, facets, sort, etc. All of those things require caches of various sorts. Regards, Alex. Personal: http://www.outerthoughts.com/ and @arafalov Solr resources and newsletter: http://www.solr-start.com/ and @solrstart Solr popularizers community: https

Re: OutOfMemoryError

2014-12-16 Thread Shawn Heisey
On 12/16/2014 9:55 AM, Trilok Prithvi wrote: We are getting OOME pretty often (every hour or so). We are restarting nodes to keep up with it. Here is our setup: SolrCloud 4.10.2 (2 shards, 2 replicas) with 3 zookeepers. Each node has: 16GB RAM 2GB JVM (Xmx 2048, Xms 1024) ~100 Million documents

Re: OutOfMemoryError while merging large indexes

2014-04-12 Thread Furkan KAMACI
Hi; According to Sun, the error happens "if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown.". Specifying more memory should be helpful. On the other
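
A sketch of the two knobs that rule involves; the heap sizes are illustrative and -jar start.jar stands in for however the JVM is actually launched:

    # "GC overhead limit exceeded" can be addressed two ways:
    # 1) give the collector more room, so less than 98% of time goes to GC:
    java -Xms4g -Xmx4g -jar start.jar
    # 2) or disable the check itself (the JVM then just keeps collecting):
    java -Xmx2g -XX:-UseGCOverheadLimit -jar start.jar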

Re: OutOfMemoryError while merging large indexes

2014-04-08 Thread Haiying Wang
From: François Schiettecatte To: solr-user@lucene.apache.org; Haiying Wang Sent: Tuesday, April 8, 2014 8:25 PM Subject: Re: OutOfMemoryError while merging large indexes Have you tried using:     -XX:-UseGCOverheadLimit François On Apr 8, 2014, at 6:06 PM, Haiying Wang wrote: > Hi, > &

Re: OutOfMemoryError while merging large indexes

2014-04-08 Thread François Schiettecatte
Have you tried using: -XX:-UseGCOverheadLimit François On Apr 8, 2014, at 6:06 PM, Haiying Wang wrote: > Hi, > > We were trying to merge a large index (9GB, 21 million docs) into current > index (only 13MB), using mergeindexes command ofCoreAdminHandler, but always > run into OOM e

Re: OutOfMemoryError in RamUsageEstimator in solr 4.6

2013-12-16 Thread Torben Greulich
Hi Shawn, thanks for your reply. But we don't think that this is really an OOM error, because we already increased the heap to 64 GB and the OOM occurs at a usage of 30-40 GB. So Solr would have to allocate more than 20 GB at once; that sounds like a bit too much. Furthermore we found Lucene45DocValuesProdu

Re: OutOfMemoryError in RamUsageEstimator in solr 4.6

2013-12-16 Thread Shawn Heisey
On 12/16/2013 2:34 AM, Torben Greulich wrote: > we get a OutOfMemoryError in RamUsageEstimator and are a little bit > confused about the error. > We are using solr 4.6 and are confused about the Lucene42DocValuesProducer. > We checked current solr code and found that Lucene42NormsFormat will be > r

Re: OutOfMemoryError

2013-03-27 Thread Arkadi Colson
I upgraded java to version 7 and everything seems to be stable now! BR, Arkadi On 03/25/2013 09:54 PM, Shawn Heisey wrote: On 3/25/2013 1:34 AM, Arkadi Colson wrote: I changed my system memory to 12GB. Solr now gets -Xms2048m -Xmx8192m as parameters. I also added -XX:+UseG1GC to the java proce

Re: OutOfMemoryError

2013-03-26 Thread Shawn Heisey
On 3/25/2013 1:34 AM, Arkadi Colson wrote: I changed my system memory to 12GB. Solr now gets -Xms2048m -Xmx8192m as parameters. I also added -XX:+UseG1GC to the java process. But now the whole machine crashes! Any idea why? Mar 22 20:30:01 solr01-gs kernel: [716098.077809] java invoked oom-kille

Re: OutOfMemoryError

2013-03-25 Thread Otis Gospodnetic
Arkadi, jstat -gcutil -h20 2000 100 also gives useful info about GC and I use it a lot for quick insight into what is going on with GC. SPM (see http://sematext.com/spm/index.html ) may also be worth using. Otis -- Solr & ElasticSearch Support http://sematext.com/ On Mon, Mar 25, 2013 at
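
A sketch of the full invocation, since jstat also needs the target JVM's pid (shown as a placeholder here):

    jstat -gcutil -h20 <solr-pid> 2000 100
    # -h20      reprint the column header every 20 samples
    # 2000 100  sample every 2000 ms, 100 samples in total
    # Watch O (old-gen occupancy %) and FGC/FGCT (full-GC count/time): if they climb
    # steadily while E keeps filling, the heap is too small for the workload.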

Re: OutOfMemoryError

2013-03-25 Thread Bernd Fehling
You can also use "-verbose:gc -XX:+PrintGCDateStamps -XX:+PrintGCDetails -Xloggc:gc.log" as additional options to get a "gc.log" file and see what GC is doing. Regards Bernd Am 25.03.2013 16:01, schrieb Arkadi Colson: > How can I see if GC is actually working? Is it written in the tomcat logs as

Re: OutOfMemoryError

2013-03-25 Thread Arkadi Colson
How can I see if GC is actually working? Is it written in the tomcat logs as well or will I only see it in the memory graphs? BR, Arkadi On 03/25/2013 03:50 PM, Bernd Fehling wrote: We use munin with jmx plugin for monitoring all server and Solr installations. (http://munin-monitoring.org/) On

Re: OutOfMemoryError

2013-03-25 Thread Bernd Fehling
We use munin with jmx plugin for monitoring all server and Solr installations. (http://munin-monitoring.org/) Only for short time monitoring we also use jvisualvm delivered with Java SE JDK. Regards Bernd Am 25.03.2013 14:45, schrieb Arkadi Colson: > Thanks for the info! > I just upgraded java f

Re: OutOfMemoryError

2013-03-25 Thread Arkadi Colson
Thanks for the info! I just upgraded java from 6 to 7... How exactly do you monitor the memory usage and the effect of the garbage collector? On 03/25/2013 01:18 PM, Bernd Fehling wrote: The use of UseG1GC, yes, but with Solr 4.x, Jetty 8.1.8 and Java HotSpot(TM) 64-Bit Server VM (1.7.0_07). os.a

Re: OutOfMemoryError

2013-03-25 Thread Bernd Fehling
The use of UseG1GC, yes, but with Solr 4.x, Jetty 8.1.8 and Java HotSpot(TM) 64-Bit Server VM (1.7.0_07). os.arch: amd64 os.name: Linux os.version: 2.6.32.13-0.5-xen Only args are "-XX:+UseG1GC -Xms16g -Xmx16g". Monitoring shows that 16g is a bit high; I might reduce it to 10g or 12g for the slaves

Re: OutOfMemoryError

2013-03-25 Thread Arkadi Colson
Is somebody using the UseG1GC garbage collector with Solr and Tomcat 7? Any extra options needed? Thanks... On 03/25/2013 08:34 AM, Arkadi Colson wrote: I changed my system memory to 12GB. Solr now gets -Xms2048m -Xmx8192m as parameters. I also added -XX:+UseG1GC to the java process. But now t

Re: OutOfMemoryError

2013-03-25 Thread Arkadi Colson
I changed my system memory to 12GB. Solr now gets -Xms2048m -Xmx8192m as parameters. I also added -XX:+UseG1GC to the java process. But now the whole machine crashes! Any idea why? Mar 22 20:30:01 solr01-gs kernel: [716098.077809] java invoked oom-killer: gfp_mask=0x201da, order=0, oom_adj=0 M

Re: OutOfMemoryError

2013-03-14 Thread Shawn Heisey
On 3/14/2013 3:35 AM, Arkadi Colson wrote: Hi I'm getting this error after a few hours of filling solr with documents. Tomcat is running with -Xms1024m -Xmx4096m. Total memory of host is 12GB. Softcommits are done every second and hard commits every minute. Any idea why this is happening and how
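
A sketch of what that commit policy looks like in solrconfig.xml (element names are standard; openSearcher=false is an assumption, not something stated in the thread):

    <!-- hard commit every minute, soft commit every second -->
    <autoCommit>
      <maxTime>60000</maxTime>
      <openSearcher>false</openSearcher>   <!-- keep hard commits cheap; visibility comes from soft commits -->
    </autoCommit>
    <autoSoftCommit>
      <maxTime>1000</maxTime>
    </autoSoftCommit>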

Re: OutOfMemoryError

2013-03-14 Thread Arkadi Colson
On 03/14/2013 03:11 PM, Toke Eskildsen wrote: On Thu, 2013-03-14 at 13:10 +0100, Arkadi Colson wrote: When I shutdown tomcat free -m and top keeps telling me the same values. Almost no free memory... Any idea? Are you reading top & free right? It is standard behaviour for most modern operatin

Re: OutOfMemoryError

2013-03-14 Thread Toke Eskildsen
On Thu, 2013-03-14 at 13:10 +0100, Arkadi Colson wrote: > When I shutdown tomcat free -m and top keeps telling me the same values. > Almost no free memory... > > Any idea? Are you reading top & free right? It is standard behaviour for most modern operating systems to have very little free memory
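
A short sketch of how to read those numbers; column names differ between procps versions:

    free -m
    #   older procps: the "-/+ buffers/cache" row shows memory actually free for applications
    #   newer procps: look at the "available" column instead of "free"
    # Most of "used" memory on a Solr box is the OS page cache holding the mmapped index,
    # and the kernel hands it back automatically when a process needs it.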

Re: OutOfMemoryError

2013-03-14 Thread Arkadi Colson
When I shutdown tomcat free -m and top keeps telling me the same values. Almost no free memory... Any idea? On 03/14/2013 10:35 AM, Arkadi Colson wrote: Hi I'm getting this error after a few hours of filling solr with documents. Tomcat is running with -Xms1024m -Xmx4096m. Total memory of hos

Re: OutOfMemoryError | While Faceting Query

2012-12-07 Thread uwe72
You mean this: stats: entries_count : 24 entry#0 : 'NIOFSIndexInput(path="/home/connect/ConnectPORTAL/preview/solr-home/data/index/_2f3.frq")'=>'WiringDiagramSheetImpl.pageNumber',class org.apache.lucene.search.FieldCache$StringIndex,null=>org.apache.lucene.search.FieldCache$StringIndex#32159051

Re: OutOfMemoryError | While Faceting Query

2012-12-07 Thread Bernd Fehling
Hi Uwe, sorting should be well prepared. First rough check is fieldCache. You can see it with SolrAdmin Stats. The "insanity_count" there should be 0 (zero). Only sort on fields which are prepared for sorting and make sense to be sorted. Do only faceting on fields which make sense. I've seen syst

Re: OutOfMemoryError when using query with sort

2011-11-16 Thread Benson Ba
Hi Hamid, I also encountered the same OOM issue on a Windows 2003 (32-bit) server... but with only 3 million articles stored in Solr. I would like to know your configuration to drive so many records. Many thanks. Best Regards Benson -- View this message in context: http://lucene.472066.n3.nabble

RE: OutOfMemoryError coming from TermVectorsReader

2011-09-23 Thread Anand.Nigam
Nigam RBS Global Banking & Markets Office: +91 124 492 5506 -Original Message- From: Otis Gospodnetic [mailto:otis_gospodne...@yahoo.com] Sent: 23 September 2011 09:35 To: solr-user@lucene.apache.org Subject: Re: OutOfMemoryError coming from TermVectorsReader Anand, But do you re

Re: OutOfMemoryError coming from TermVectorsReader

2011-09-22 Thread Otis Gospodnetic
/ :: Solr - Lucene - Nutch Lucene ecosystem search :: http://search-lucene.com/ - Original Message - > From: "anand.ni...@rbs.com" > To: solr-user@lucene.apache.org > Cc: > Sent: Thursday, September 22, 2011 11:56 PM > Subject: RE: OutOfMemoryError coming from

RE: OutOfMemoryError coming from TermVectorsReader

2011-09-22 Thread Anand.Nigam
bit Solr version : 3.4.0 Thanks & Regards Anand Anand Nigam RBS Global Banking & Markets Office: +91 124 492 5506 -Original Message- From: Glen Newton [mailto:glen.new...@gmail.com] Sent: 19 September 2011 16:52 To: solr-user@lucene.apache.org Subject: Re: OutOfMemoryErro

Re: OutOfMemoryError coming from TermVectorsReader

2011-09-19 Thread Glen Newton
Please include information about your heap size (and other Java command line arguments) as well as platform OS (version, swap size, etc), Java version, and underlying hardware (RAM, etc) for us to better help you. From the information you have given, increasing your heap size should help. Thanks, Gl

Re: OutOfMemoryError with DIH loading 140.000 documents

2011-05-02 Thread Otis Gospodnetic
Zoltan - Solr is not preventing you from giving your JVM 2GB heap, something else is. If you paste the error we may be able to help. Otis Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch Lucene ecosystem search :: http://search-lucene.com/ - Original Message > From: Zol

Re: OutOfMemoryError with DIH loading 140.000 documents

2011-05-02 Thread Erick Erickson
What do you have your commit parameters set to in solrconfig.xml? I suspect you can make this all work by reducing the RAM threshold in the config file. Best, Erick On Mon, May 2, 2011 at 4:55 AM, Zoltán Altfatter wrote: > Hi, > > I receive OutOfMemoryError with Solr 3.1 when loading around 14
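
A sketch of the settings Erick is most likely pointing at, in the Solr 3.x solrconfig.xml layout this thread is using; the values are illustrative:

    <indexDefaults>
      <!-- flush in-memory index data to disk before the buffer grows large -->
      <ramBufferSizeMB>64</ramBufferSizeMB>
    </indexDefaults>
    <updateHandler class="solr.DirectUpdateHandler2">
      <autoCommit>
        <maxDocs>25000</maxDocs>   <!-- commit periodically during the large DIH import -->
      </autoCommit>
    </updateHandler>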

Re: OutOfMemoryError when using query with sort

2010-05-03 Thread Erick Erickson
c per day > > > > > > From: Koji Sekiguchi > To: solr-user@lucene.apache.org > Sent: Sun, May 2, 2010 9:08:42 PM > Subject: Re: OutOfMemoryError when using query with sort > > Hamid Vahedi wrote: > > Hi, i using solr that running on

Re: OutOfMemoryError when using query with sort

2010-05-02 Thread Hamid Vahedi
, 2010 9:08:42 PM Subject: Re: OutOfMemoryError when using query with sort Hamid Vahedi wrote: > Hi, i using solr that running on windows server 2008 32-bit. > I add about 100 million article into solr without set store attribute. (only > store document id) (index file size about 164 GB) &

Re: OutOfMemoryError when using query with sort

2010-05-02 Thread Koji Sekiguchi
Hamid Vahedi wrote: Hi, i using solr that running on windows server 2008 32-bit. I add about 100 million article into solr without set store attribute. (only store document id) (index file size about 164 GB) when try to get query without sort , it's return doc ids in some ms, but when add sor

RE: OutOfMemoryError due to auto-warming

2009-09-24 Thread Francis Yakin
I reduced the size of queryResultCache in solrconfig (to 200 from 500), which seems to fix the issue as well. Francis -Original Message- From: didier deshommes [mailto:dfdes...@gmail.com] Sent: Thursday, September 24, 2009 3:32 PM To: solr-user@lucene.apache.org Cc: Andrew Mont
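
A sketch of the solrconfig.xml entry being described (the XML itself was stripped from the archived message); the class and autowarmCount shown here are assumptions:

    <queryResultCache class="solr.LRUCache"
                      size="200"
                      initialSize="200"
                      autowarmCount="50"/>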

Re: OutOfMemoryError due to auto-warming

2009-09-24 Thread didier deshommes
On Thu, Sep 24, 2009 at 5:40 PM, Francis Yakin wrote: > You also can increase the JVM HeapSize if you have enough physical memory, > like for example if you have 4GB physical, gives the JVM heapsize 2GB or > 2.5GB. Thanks, we can definitely do that (we have 4GB available). I also forgot to add

RE: OutOfMemoryError due to auto-warming

2009-09-24 Thread Francis Yakin
You also can increase the JVM HeapSize if you have enough physical memory, like for example if you have 4GB physical, gives the JVM heapsize 2GB or 2.5GB. Francis -Original Message- From: didier deshommes [mailto:dfdes...@gmail.com] Sent: Thursday, September 24, 2009 3:32 PM To: solr-u

Re: OutOfMemoryError - Quick Fix: Increase HashDocSet

2008-07-17 Thread Fuad Efendi
Thanks Mike, I have 25 million docs indexed, faceted on simple fields (cardinality: 5 for country field and 1 for host field). 8192 MB heap, JRockit R27 (Java 6). Unpredictable OOMs... I set HashDocSet/max to 30,000, don't see any performance degradation yet (the same response times for faceted
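
For reference, the setting being discussed is the old <HashDocSet> element in the <query> section of solrconfig.xml (present in the 1.x line, removed in later releases); a sketch with Fuad's value, loadFactor being the shipped default as I recall it:

    <!-- result sets smaller than maxSize use a HashDocSet instead of a bitset -->
    <HashDocSet maxSize="30000" loadFactor="0.75"/>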

Re: OutOfMemoryError - Quick Fix: Increase HashDocSet

2008-07-17 Thread Mike Klaas
On 17-Jul-08, at 10:28 AM, Fuad Efendi wrote: Change it to higher value, for instance, 3. OpenBitSet is created for larger values and requires a lot of memory... Careful--hash sets of that size can be quite slow. It does make sense to bump up the value to 6000 or so for large