Thanks for the tips Erick. I'm really talking about 2.5GB files full of data to
be indexed, like .csv files or .xls, .ods and so on. I guess I'll try greatly
increasing the memory available to the JVM.
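For reference, a minimal sketch of how that heap increase might look when launching Solr's example Jetty (the start.jar launcher and the 4g figure are illustrative assumptions, not a recommendation):

```shell
# Launch Solr's bundled Jetty with a larger heap.
# -Xms sets the initial heap size, -Xmx the maximum; values here are examples.
java -Xms512m -Xmx4g -jar start.jar
```

If Solr runs inside another container (Tomcat etc.), the same -Xmx flag would go in that container's JVM options instead.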
Regards,
Augusto
>>> Erick Erickson 1/27/2012 1:22 pm >>>
Hmmm, I'd go considerably higher than 2.5G. The problem is that the Tika
processing will need memory, I have no idea how much. Then Solr will
need a bunch of memory to index it all, etc.
But I also suspect that this will be about useless to index (assuming
you're talking lots of data, not say just the
I'm talking about 2 GB files. Does that mean I'll have to allocate something
bigger than that for the JVM? Something like 2.5 GB?
Thanks,
Augusto Camarotti
>>> Erick Erickson 1/25/2012 1:48 pm >>>
Mostly it depends on your container settings, quite often that's
where the limits are. I don't think Solr imposes any restrictions.
What size are we talking about anyway? There are implicit
issues with how much memory parsing the file requires, but you
can allocate lots of memory to the JVM to han