Hi, thanks for your advice!

I have deliberately allocated 32 GB to the JVM, with a command like "java -Xmx32000m
-jar start.jar". I am using our server, which I think has 48 GB in total.
However, it still crashes with that error whenever I specify any keywords
in my query. The only query that works, as I said, is "q=*:*".

I also realised that the best configuration would be a SolrCloud setup. It's a
shame that I cannot split this index for that purpose, but instead have to
re-index everything.

But I would very much like to know exactly what happened to cause that
error:

"java.lang.OutOfMemoryError: OutOfMemoryError likely caused by the Sun VM
Bug described in https://issues.apache.org/jira/browse/LUCENE-1566; 
try calling FSDirectory.setReadChunkSize with a value smaller than the
current chunk size (2147483647)"

In particular, what does the last line tell me?
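One thing I did notice, for what it's worth: the number in that last line is exactly Integer.MAX_VALUE, i.e. 2^31 - 1 bytes (just under 2 GB). So if that value really is the per-read chunk size used by FSDirectory (that's my assumption from the message, not something I've verified in the code), the suggested setReadChunkSize call would presumably just replace that ~2 GB default with something smaller. A quick sanity check on the number:

```java
public class ChunkSizeNote {
    public static void main(String[] args) {
        // The chunk size reported in the error message:
        long reportedChunkSize = 2147483647L;

        // It is exactly Integer.MAX_VALUE, i.e. 2^31 - 1 bytes (~2 GB),
        // which looks like a "read the file in one maximal chunk" default.
        System.out.println(reportedChunkSize == Integer.MAX_VALUE);  // prints "true"
    }
}
```

But I'd still like to understand why reading in ~2 GB chunks triggers an OutOfMemoryError here.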

Many thanks again!



--
View this message in context: 
http://lucene.472066.n3.nabble.com/170G-index-1-5-billion-documents-out-of-memory-on-query-tp4042696p4042796.html
Sent from the Solr - User mailing list archive at Nabble.com.