I'd be very surprised if this were to work. I recall one situation in
which 24 facets in a request placed too much pressure on the server.

In order to support faceting, Solr maintains a cache of the faceted
field. You need one cache for each field you are faceting on, meaning
your memory requirements will be substantial, unless, I guess, your
fields are sparse. Also, during a faceting request, the server must do a
scan across each of those fields, which will take time, and with that
many fields, I'd imagine quite a bit of time.
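As a rough back-of-envelope sketch (assuming, purely for illustration,
on the order of four bytes of ordinal data per document per cached
field; actual usage depends on field cardinality, docValues, and the
facet method):

```python
# Hypothetical sizing of per-field facet caches.
# The 4-bytes-per-doc-per-field figure is an assumption, not
# a measurement of Lucene's real field cache layout.
num_docs = 10_811_177
num_facet_fields = 5_000
bytes_per_doc_per_field = 4

total_bytes = num_docs * num_facet_fields * bytes_per_doc_per_field
total_gb = total_bytes / 1024**3
print(f"~{total_gb:.0f} GB")  # prints "~201 GB"
```

Even if the real per-field cost is a fraction of that, it dwarfs a
12 GB machine, which matches the out-of-memory errors you are seeing.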

Upayavira

On Mon, Mar 18, 2013, at 07:34 AM, sivaprasad wrote:
> Hi,
> 
> We have configured Solr with 5000 facet fields as part of the request
> handler. We have 10,811,177 docs in the index.
> 
> The Solr server machine is a quad core with 12 GB of RAM.
> 
> When we are querying with facets, we are getting an out-of-memory
> error.
> 
> What we observed is that the more facets we have, the more RAM we need
> to allocate to the JVM. In this case, we need to scale up the system
> whenever we add more facets.
> 
> To scale out the system, do we need to go with distributed search?
> 
> Any thoughts on how to handle this situation would help.
> 
> Thanks,
> Siva
> 
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Facets-with-5000-facet-fields-tp4048450.html
> Sent from the Solr - User mailing list archive at Nabble.com.
