On 2/4/2014 9:49 PM, Sathya wrote:
> Yes, all the instances are reading the same 8GB of data at a time. The Java
> search programs (more than 15 instances) are running on different machines,
> in different JVMs, and they access the Solr server machine (Ubuntu 64-bit).
> The Solr index is not sharded. Query rates are too poor (more than 5 seconds
> per search in a single instance).

How many instances of Solr itself (not your client application that does
searches) are you running?  How much java heap is allocated to Solr?  If
you don't know, open the admin UI dashboard and look at the "JVM-Memory"
bar graph.  The number at the far right is the one you need.  What else
is running on the same box and competing for that 24GB of RAM?  The
answers to those questions will determine what to look at next.
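If you'd rather check from a script than the dashboard, the same JVM memory
figures are exposed by Solr's system-info handler at
/solr/admin/info/system?wt=json. A minimal sketch of pulling the max-heap
number out of that response (the sample JSON below is illustrative, not real
output from your server, and the exact field layout is an assumption based on
a typical Solr 4.x install):

```python
import json

def max_heap_bytes(system_info):
    # The "max" figure under jvm.memory.raw corresponds to the number at
    # the far right of the admin UI's JVM-Memory bar graph.
    return system_info["jvm"]["memory"]["raw"]["max"]

# Illustrative snippet of a system-info response; in practice you would
# fetch http://<solr-host>:8983/solr/admin/info/system?wt=json instead.
sample = json.loads("""
{"jvm": {"memory": {"raw": {"free": 215000000,
                            "total": 520000000,
                            "max": 4294967296,
                            "used": 305000000}}}}
""")

print(max_heap_bytes(sample) / (1024 ** 3))  # max heap in GiB -> 4.0
```

That tells you the heap ceiling; comparing it against what the OS has left
over for the disk cache is the next step the wiki page below walks through.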

http://wiki.apache.org/solr/SolrPerformanceProblems

If I read your first message right, "7lac" means that you have 700000
documents.  The lac or lakh is not a common unit of measurement outside
of South Asia, which means that a lot of people on this list have no
idea what it is.  In order to get an 8GB index out of 700000 documents,
each one must be quite large, or you've done something that amplifies
the data, like aggressive tokenization or a LOT of copyField entries.
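The back-of-the-envelope math there is simple (a rough sketch; I'm treating
"8GB" as 8 GiB, and the real per-document cost will vary with field count
and tokenization):

```python
# Average on-disk index bytes per document for an 8GB index of 700000 docs.
index_bytes = 8 * 1024 ** 3   # treating "8GB" as 8 GiB
doc_count = 700_000

avg_bytes = index_bytes / doc_count
print(round(avg_bytes / 1024, 1))  # roughly 12 KB of index data per document
```

About 12KB of index per document is a lot for typical short documents, which
is why large source documents or data amplification are the likely causes.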

Thanks,
Shawn
