Can anyone from the Solr team shed some more light on this?

On Tue, Jun 12, 2018 at 8:13 PM, Ratnadeep Rakshit <ratnad...@qedrix.com>
wrote:

> I observed that the build works if the data size is below 25M records. The
> moment the records go beyond that, this OOM error shows up. Solr itself
> shows 56% usage of the 20GB space during the build. So, are there any
> settings I need to change to handle the larger data size?
>
> On Tue, Jun 12, 2018 at 3:17 PM, Alessandro Benedetti <
> a.benede...@sease.io> wrote:
>
>> Hi,
>> first of all, the two suggesters you are using are based on different
>> data structures (with different memory utilisation):
>>
>> - FuzzyLookupFactory -> FST (in memory, and stored in binary form on disk)
>> - AnalyzingInfixLookupFactory -> auxiliary Lucene index
>>
>> Both data structures should be very memory efficient (both during
>> building and in storage); a configuration sketch follows below.
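>>
>> For reference, a minimal solrconfig.xml sketch wiring the two lookup
>> types to your fields (the field names come from your messages; the
>> dictionary implementation and the analyzer field type are assumptions
>> on my side):
>>
>> <searchComponent name="suggest" class="solr.SuggestComponent">
>>   <lst name="suggester">
>>     <str name="name">fuzzySuggester</str>
>>     <str name="lookupImpl">FuzzyLookupFactory</str>
>>     <!-- builds an FST from the stored field values -->
>>     <str name="dictionaryImpl">DocumentDictionaryFactory</str>
>>     <str name="field">site_address</str>
>>     <str name="suggestAnalyzerFieldType">text_general</str>
>>     <str name="buildOnCommit">false</str>
>>   </lst>
>>   <lst name="suggester">
>>     <str name="name">infixSuggester</str>
>>     <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
>>     <str name="dictionaryImpl">DocumentDictionaryFactory</str>
>>     <str name="field">site_address_other</str>
>>     <str name="suggestAnalyzerFieldType">text_general</str>
>>     <!-- the auxiliary Lucene index lives under the data dir -->
>>     <str name="indexPath">infix_suggester_index</str>
>>     <str name="buildOnCommit">false</str>
>>   </lst>
>> </searchComponent>
>>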
>> What is the cardinality of the fields you are building suggestions from
>> (site_address and site_address_other)?
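>>
>> If you want a quick way to measure that, the JSON Facet API can compute
>> approximate distinct counts (collection name and host are placeholders):
>>
>> curl 'http://localhost:8983/solr/your_collection/query' -d '
>> {
>>   "query": "*:*",
>>   "limit": 0,
>>   "facet": {
>>     "site_address_cardinality": "unique(site_address)",
>>     "site_address_other_cardinality": "unique(site_address_other)"
>>   }
>> }'
>>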
>> What is the memory situation in Solr when you start the suggester build?
>> You are allocating much more memory to the Solr JVM process than you are
>> leaving to the OS, which in your situation means the OS cache cannot fit
>> the entire index (the ideal scenario).
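>>
>> As a rough illustration (the numbers are invented, size them to your own
>> host): on a machine with 64GB of RAM you would keep the heap modest and
>> leave the rest to the OS page cache, e.g. in solr.in.sh:
>>
>> # keep the JVM heap well below total RAM so the OS page cache
>> # can hold (most of) the index files
>> SOLR_HEAP="12g"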
>>
>> I would recommend putting some monitoring in place (there are plenty of
>> open source tools for that).
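>>
>> For a quick snapshot without any external tooling, Solr's own Metrics API
>> (available since Solr 6.4) exposes the JVM memory figures; a sketch,
>> assuming the default port:
>>
>> curl 'http://localhost:8983/solr/admin/metrics?group=jvm&prefix=memory'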
>>
>> Regards
>>
>>
>>
>> -----
>> Alessandro Benedetti
>> Search Consultant, R&D Software Engineer, Director
>> Sease Ltd. - www.sease.io
>>
>
>
