Thank you Elaine,

splitting the files worked for me too.
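
In case it helps anyone else: I split the file with the standard coreutils
split command, picking a line count that keeps each chunk under 1 MB (the
exact number depends on your dictionary; 20000 is just an illustration):

  split -l 20000 synonyms.txt synonyms_part_

That produces synonyms_part_aa, synonyms_part_ab, ... which can then be
listed in the filter specification as Elaine described.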



2014-05-06 19:15 GMT+02:00 Cario, Elaine <elaine.ca...@wolterskluwer.com>:

> Hi Giovanni,
>
> I had the same issue just last week!  I worked around it temporarily by
> segmenting the file into < 1 MB files, and then using a comma-delimited
> list of files in the filter specification in the schema.
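>
> Roughly like this in schema.xml (a sketch from memory, not my exact
> config; the file names are just placeholders):
>
>   <filter class="solr.SynonymFilterFactory"
>           synonyms="synonyms_part_aa.txt,synonyms_part_ab.txt,synonyms_part_ac.txt"
>           ignoreCase="true" expand="true"/>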
>
> There is a known issue around this:
>
> https://issues.apache.org/jira/browse/SOLR-4793
>
> ...and presumably there is a param you can set in ZooKeeper and Solr
> (jute.maxbuffer) to override the 1 MB limit.  I didn't have enough time
> to test that out (and it's not clear to me what form the value should
> take); at the time it was easier for me to brute-force the files.
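>
> If you do try it, jute.maxbuffer is a JVM system property and apparently
> has to be set on both the ZooKeeper servers and the Solr nodes, something
> along these lines (untested on my side; 10485760 is just 10 MB as an
> example):
>
>   SERVER_JVMFLAGS="-Djute.maxbuffer=10485760"    # in conf/zookeeper-env.sh
>   java -Djute.maxbuffer=10485760 -jar start.jar  # when starting Solr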
>
> -----Original Message-----
> From: Giovanni Bricconi [mailto:giovanni.bricc...@banzai.it]
> Sent: Tuesday, May 06, 2014 12:11 PM
> To: solr-user
> Subject: solr cloud 4.8, synonymfilterfactory and big dictionaries
>
> Hello
>
> I am migrating an application to SolrCloud and I have to deal with a big
> synonyms dictionary, about 10 MB.
>
> It seems that I can't upload it to ZooKeeper. Is there a way of specifying
> an external file for the synonyms parameter?
>
> Can I compress the file, or split it into many small files?
>
> I have the same problem with SnowballPorterFilterFactory.
>
> Thanks
>
