--- On Fri, 7/16/10, marship wrote:
> From: marship
> Subject: Re:Re: How to speed up solr search speed
> To: solr-user@lucene.apache.org
> Date: Friday, July 16, 2010, 11:26 AM
> Hi. Peter.
>
I don't know of a way to tell Solr to load all the indexes into
memory, but if you were to simply read all the files at the OS level,
that would do it. Under a unix OS, "cat * > /dev/null" would work. Under
Windows, I can't think of a way to do it off the top of my head, but if
you had Cygwin installed, the same cat command would work there too.
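A minimal sketch of that approach on a unix OS, assuming a typical per-core
index directory (the path below is hypothetical):
# read every index file once so the OS page cache ends up holding it
cat /opt/solr/core0/data/index/* > /dev/null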
Hi. Shawn.
My indexes are smaller than yours. I only store "id" + "type" in indexes so
each "core" index is about 1 - 1.5GB on disk.
I don't have as many servers/VPS as you have. In my opinion, my problem is not
CPU. If possible, I prefer to add more memory to fit the indexes on my server. At
least a ...
On 7/17/2010 3:28 AM, marship wrote:
Hi. Peter and All.
I merged my indexes today. Now each index stores 10M documents. Now I only have
10 solr cores.
And I used
java -Xmx1g -jar -server start.jar
to start the jetty server.
How big are the indexes on each of those cores? You can easily get th
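One hedged way to answer that from the shell, assuming the cores keep their data
under a conventional layout (the path is illustrative):
# report the on-disk size of each core's index directory
du -sh /opt/solr/core*/data/index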
Hi. Geert-Jan.
Thanks for replying.
I know Solr has a query cache and that it improves search speed from the second
time onward. But when I talk about search speed, I don't mean the speed of the
cache. When a user searches on our site, I don't want the first query to cost
10s and only the following ones to be fast.
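One hedged way to absorb that cold-start cost before real users see it, reusing
the host/port and query shown elsewhere in this thread, is to fire a few
representative queries right after startup (Solr can also do this automatically
with firstSearcher warming queries in solrconfig.xml):
# issue a throwaway query so index files and caches are warm before users arrive
curl 'http://localhost:7550/solr/select/?q=design&start=0&rows=10' > /dev/null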
>My query string is always simple like "design", "principle of design",
"tom"
>EG:
>URL:
http://localhost:7550/solr/select/?q=design&version=2.2&start=0&rows=10&indent=on
IMO, indeed with these types of simple searches, caching (and thus RAM usage)
cannot be fully exploited, i.e. there isn't really ...
Hi. Peter and All.
I merged my indexes today. Now each index stores 10M documents. Now I only have
10 solr cores.
And I used
java -Xmx1g -jar -server start.jar
to start the jetty server.
At first I deployed them all on one server. The search speed was about 3s. Then
I noticed from the cmd output ...
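For anyone wanting to do that kind of consolidation offline, a hedged sketch
using Lucene's IndexMergeTool from the contrib/misc jar (jar names and paths are
illustrative, and the merge should run on index copies or while Solr is stopped):
# merge several existing core indexes into one new index directory
java -cp lucene-core.jar:lucene-misc.jar org.apache.lucene.misc.IndexMergeTool /data/merged-index /data/core0/index /data/core1/index
The merged directory can then be pointed at by a single new core.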
> > Each solr(jetty) instance only consumes 40M-60M of memory.
> java -Xmx1024M -jar start.jar
That's a good suggestion!
Please double-check that you are using the -server version of the JVM
and the latest, 1.6.0_20 or so.
Additionally you can start jvisualvm (shipped with the JDK) and hook
into the Jetty process to watch heap usage and GC activity.
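A hedged sketch of that combination (jvisualvm can usually list and attach to
local JVM processes without any extra flags):
# start Jetty on the server VM with a 1 GB heap, then attach jvisualvm to it
java -server -Xmx1024M -jar start.jar
jvisualvm   # pick the start.jar process under Local to watch heap, GC and threads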
... to both. One dies and it falls back to the other.
Dennis Gearon
Signature Warning
EARTH has a Right To Life,
otherwise we all die.
Read 'Hot, Flat, and Crowded'
Laugh at http://www.yert.com/film.php
You mentioned that you have a lot of memory free, but your Jetty containers are
only using between 40-60 MB.
Probably stating the obvious, but have you increased the -Xmx param, for
instance:
java -Xmx1024M -jar start.jar
That way you're configuring the container to use a maximum of 1024 MB of RAM
instead of ...
Hi Tom Burton-West.
Sorry, it looks like my email ISP filtered out your replies. I checked the web
version of the mailing list and saw your reply.
My query string is always simple like "design", "principle of design", "tom"
EG:
URL:
http://localhost:7550/solr/select/?q=design&version=2.2&start=0&rows=10&indent=on
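For the distributed case discussed in this thread, a hedged sketch of the same
query fanned out over several cores with Solr's standard shards parameter (the
host/port list is illustrative):
http://localhost:7550/solr/select/?q=design&start=0&rows=10&shards=localhost:7550/solr,localhost:7551/solr,localhost:7552/solr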
>>>> The problem is even if I put 2M documents into each core, then I
>>>> have only 36 cores at the moment. But when our documents double in
>>>> the future, the same issue will arise again. So I don't think keeping 1M in
>>>> each core is the issue.
>>> ... to improve solr search speed in some other way.
>>> Any suggestion?
>>>
>>> Regards.
>>> Scott
>>>
>>> At 2010-07-15 15:24:08, "Fornoville, Tom" wrote:
>> Is there any reason why you have to limit each instance to only 1M
>> documents?
>> If you could put more documents in the same core I think it would
>> dramatically improve your response times.
>>
>> -Original Message-
>> From: marship [mailto:mars...@126.
>Sent: Thursday, 15 July 2010 6:23
>To: solr-user
>Subject: How to speed up solr search speed
>
>Hi. All.
>I got a problem with distributed solr search. The issue is
>I have 76M documents spread over 76 solr instances, each instance
>handles 1M documents.
>Previously I put all 76 instances on a single server and when I tested ...
Hi. All.
I got a problem with distributed solr search. The issue is:
I have 76M documents spread over 76 solr instances; each instance handles
1M documents.
Previously I put all 76 instances on a single server, and when I tested I found
that each time it ran, it took several seconds, most ...