On 2/25/2015 5:21 AM, Moshe Recanati wrote:
> We checked this option and it didn't solve our problem.
> We're using https://github.com/healthonnet/hon-lucene-synonyms for
> query-based synonyms.
> While running a query with a high number of words, each of which has
> a high number of synonyms, the query got stuck and Solr's memory was
> exhausted.
> We tried the parameter you suggested, but it didn't stop the query or
> solve the issue.
> 
> Please let me know if there is another option to tackle this. Today
> it might be a high number of words causing the issue, and tomorrow it
> might be something else. We can't rely only on checking user input.

If legitimate queries use that much memory, you'll either need to
increase the Java heap (the -Xmx option on the JVM command line) so it
can handle the larger requirement, or you'll have to take steps to
reduce memory usage.

Those steps might include changes to your application code to detect
problematic queries before they are ever sent to Solr, and/or educating
your users about how to use the search properly.
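
As a rough sketch of the application-side check (nothing here is built
into Solr or the synonym plugin; the class name and limits are invented
for the example), you could refuse to forward any query that exceeds a
word or length budget:

// Hypothetical pre-flight check in the application layer, run before the
// raw query string is handed to Solr.  The class name and both limits are
// invented for this sketch -- tune them against your own synonym lists
// and heap size.
public final class QueryGuard {

    private static final int MAX_WORDS = 10;          // assumed limit
    private static final int MAX_QUERY_CHARS = 200;   // assumed limit

    private QueryGuard() {}

    /** Returns true if the raw user query looks cheap enough to send to Solr. */
    public static boolean isSafe(String rawQuery) {
        if (rawQuery == null || rawQuery.isEmpty()
                || rawQuery.length() > MAX_QUERY_CHARS) {
            return false;
        }
        // Rough whitespace word count: every additional word multiplies the
        // number of synonym combinations the expanded query can contain.
        int words = rawQuery.trim().split("\\s+").length;
        return words <= MAX_WORDS;
    }

    public static void main(String[] args) {
        System.out.println(isSafe("heart attack treatment"));   // true
        System.out.println(isSafe("one two three four five six "
                + "seven eight nine ten eleven"));              // false (11 words)
    }
}

A crude gate like this won't catch every expensive query, but it stops
the obvious runaway cases without depending on users to restrain
themselves, and it's easy to extend with whatever other heuristics turn
out to correlate with trouble on your data.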

Lucene and Solr are constantly making advances in memory efficiency, so
staying on the latest version goes a long way toward keeping memory
usage under control.

Thanks,
Shawn
