Memory leak for multiple-field faceted searching

2011-03-21 Thread newsam
Hi, here is my environment: 32-bit Windows Server 2008 (4 CPUs and 12G RAM), 320 million docs, index size 12G. When we use query=*:* and 10 fields for faceted searching, it works. However, faceting on 15 or more fields caused the following exception: java.lang.OutOfMemoryError: Java heap space
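
A hedged note, not from the original thread: with Solr 1.4-era faceting the default facet.method=fc loads each faceted field's values into memory, so heap use grows with every additional facet field, and a 32-bit JVM can typically only address roughly 1.5-2G of heap regardless of the 12G of physical RAM. One common workaround is facet.method=enum, which works off the filterCache instead. A minimal sketch (host, port and field names f1..f15 are illustrative; the parameters are standard Solr request parameters):

    http://localhost:8983/solr/select?q=*:*&rows=0
        &facet=true
        &facet.field=f1&facet.field=f2 ... &facet.field=f15
        &facet.method=enum
        &facet.enum.cache.minDf=100

facet.enum.cache.minDf keeps very rare terms out of the filterCache so the enum method does not simply trade one memory problem for another.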

How to use different solrconfig.xml for nodes in SolrCloud?

2010-10-03 Thread newsam
Hi all, I am trying to set up a two-shard Solr cluster with the dev cloud branch (http://wiki.apache.org/solr/SolrCloud). We'd like to use a different solrconfig.xml for each of the two shards; the main difference is the replication information. However, the cluster model will use a single solrconfig.xml and ...
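
Not an answer from the original thread, but to illustrate why the two configs differ: before SolrCloud, the replication role lives in each node's own solrconfig.xml via the standard ReplicationHandler, so master and slave need different files. A hedged sketch (host names are hypothetical):

    <!-- master's solrconfig.xml -->
    <requestHandler name="/replication" class="solr.ReplicationHandler">
      <lst name="master">
        <str name="replicateAfter">commit</str>
        <str name="confFiles">schema.xml,stopwords.txt</str>
      </lst>
    </requestHandler>

    <!-- slave's solrconfig.xml -->
    <requestHandler name="/replication" class="solr.ReplicationHandler">
      <lst name="slave">
        <str name="masterUrl">http://master-host:8983/solr/replication</str>
        <str name="pollInterval">00:00:60</str>
      </lst>
    </requestHandler>

The SolrCloud branch keeps configuration in ZooKeeper and assumes the nodes of a collection share one solrconfig.xml, which is exactly the tension described above.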

Re: Why the query performance is so different for queries?

2010-09-29 Thread newsam
Quoted reply (Wed, 29 Sep 2010): >How much RAM does the JVM have? >Wildcard queries are slow, and queries starting with '*' are even slower. If ...
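
For context on the quoted question: what matters to Solr is the heap given to the servlet container, not the machine's total RAM. A hedged example of raising Tomcat's heap (the sizes are illustrative only):

    # set before starting Tomcat (on Windows: set JAVA_OPTS=-Xms1g -Xmx2g)
    export JAVA_OPTS="-Xms1g -Xmx2g"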

Why the query performance is so different for queries?

2010-09-28 Thread newsam
Hi guys, I have posted a thread "The search response time is too long". The SOLR searcher instance is deployed with Tomcat 5.5.21. The index file is 8.2G. The doc num is 6110745. The DELL server has an Intel(R) Xeon(TM) CPU (4 cores, 3.00GHz) and 6G RAM. In the SOLR back-end, "query=key:*" costs almost ...
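
A hedged explanation of the gap described above: "key:*" is parsed as a prefix/wildcard query that must enumerate every term indexed in the key field before matching, while "*:*" is the special match-all query and skips term enumeration entirely. As a sketch against the instance described in the thread:

    # slow: walks every term of the key field
    http://localhost:8080/solr/select?q=key:*

    # fast: MatchAllDocsQuery, no term enumeration
    http://localhost:8080/solr/select?q=*:*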

Re: Re: The search response time is too long

2010-09-28 Thread newsam
Thanks. I will let you know the latest status. Quoted reply from Lance Norskog (Tue, 28 Sep 2010): >Copy the index. Delete half of the ...
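
The quoted advice (copy the index, delete half of the documents, presumably a different half from each copy, then optimize) was the usual manual way to split a Solr 1.4 index, since no split tool shipped with it. A hedged sketch of the two steps, assuming a sortable id field and hypothetical hosts and core names:

    # on copy A: drop the upper id range, then optimize; do the mirror-image delete on copy B
    curl 'http://host1:8080/solr/shard1/update?commit=true' -H 'Content-Type: text/xml' \
         --data-binary '<delete><query>id:[3000000 TO *]</query></delete>'

    # afterwards, search both halves with a distributed query
    http://host1:8080/solr/shard1/select?q=*:*&shards=host1:8080/solr/shard1,host2:8080/solr/shard2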

Re: Re: The search response time is too long

2010-09-28 Thread newsam
I guess you are correct. We used the default SOLR cache configuration; I will change the cache configuration. BTW, I want to deploy several shards from the existing 8G index file, such as 4G per shard. Is there any tool to generate two shards from one 8G index file? (quoting a reply from kenf_nc) ...
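
On the cache remark above, a hedged sketch of the solrconfig.xml entries that are usually raised from their defaults; the sizes are illustrative, not recommendations:

    <filterCache      class="solr.FastLRUCache" size="4096" initialSize="1024" autowarmCount="256"/>
    <queryResultCache class="solr.LRUCache"     size="4096" initialSize="1024" autowarmCount="256"/>
    <documentCache    class="solr.LRUCache"     size="8192" initialSize="2048"/>

Larger caches trade heap for fewer disk reads, so they interact directly with the I/O concern raised elsewhere in the thread.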

Re: The search response time is too long

2010-09-27 Thread newsam
We used SOLR 1.4. All queries were executed in the SOLR back-end. I guess that I/O operations consume too much time. (quoting the earlier "newsam" message, subject "Re: The search response time is too long") ...

The search response time is too long

2010-09-27 Thread newsam
I have set up a SOLR searcher instance with Tomcat 5.5.21. However, the response time is too long. Here is my scenario: 1. The index file is 8.2G. The doc num is 6110745. 2. DELL server: Intel(R) Xeon(TM) CPU (4 cores, 3.00GHz), 6G memory. I used "Key:*" to query all records via localhost:8080. The response ...
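
A hedged illustration of the request described above: asking for every record makes the response itself the bottleneck, so the usual measurement is a match-all query with a small rows value and a narrow field list (the field name id is hypothetical):

    http://localhost:8080/solr/select?q=*:*&rows=10&fl=id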