bq: I changed <maxWarmingSearchers>*2*</maxWarmingSearchers>
to <maxWarmingSearchers>*100*</maxWarmingSearchers> and applied simultaneous
searching using 100 workers.

Do not do this. maxWarmingSearchers has nothing to do with the number of
searcher threads. And with your update rate, especially if you continue to
insist on adding commit=true to every update request, this will explode
your memory requirements. To no good purpose whatsoever.
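A sketch of the alternative (the intervals here are illustrative assumptions, not values from this thread): drop commit=true from update requests and let Solr batch commits itself via autoCommit/autoSoftCommit in solrconfig.xml, for example:

```xml
<!-- solrconfig.xml (sketch; tune the intervals for your update rate) -->
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- Hard commit: flushes to disk but does not open a new searcher -->
  <autoCommit>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- Soft commit: makes newly indexed documents visible to searches -->
  <autoSoftCommit>
    <maxTime>5000</maxTime>
  </autoSoftCommit>
</updateHandler>
```

With something like this in place, update requests are sent without any commit parameter, and maxWarmingSearchers can stay at its default of 2.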

bq: But MongoDB can handle concurrent searching and indexing faster.

Because MongoDB is optimized for different kinds of operations. Solr is a
ranking, free-text search engine. It's an apples-and-oranges comparison.
If MongoDB meets your search needs, you should use it.

Best,
Erick

On Sun, Aug 9, 2015 at 11:04 PM, Nitin Solanki <nitinml...@gmail.com> wrote:
> Hi,
>      I used solr 5.2.1 version. It is fast, I think. But again, I am stuck
> on concurrent searching and threading. I changed
> <maxWarmingSearchers>*2*</maxWarmingSearchers>
> to <maxWarmingSearchers>*100*</maxWarmingSearchers>. And apply simultaneous
> searching using 100 workers. It works fast but not upto the mark.
>
> It reduces search time from 1.5 seconds to 0.5 seconds. But if I run only a
> single worker, the search time is 0.03 seconds, which is very fast but not
> achievable with 100 workers running simultaneously.
>
> As Shawn said - "Making 100 concurrent indexing requests at the same time
> as 100
> concurrent queries will overwhelm *any* single Solr server". I got your
> point.
>
> But MongoDB can handle concurrent searching and indexing faster. Then why
> not Solr? Sorry for this.
>
>
>
> On Mon, Aug 10, 2015 at 2:39 AM Shawn Heisey <apa...@elyograg.org> wrote:
>
>> On 8/7/2015 1:15 PM, Nitin Solanki wrote:
>> > I wrote a python script for indexing and using
>> > urllib and urllib2 for indexing data via http..
>>
>> There are a number of Solr python clients.  Using a client makes your
>> code much easier to write and understand.
>>
>> https://wiki.apache.org/solr/SolPython
>>
>> I have no experience with any of these clients, but I can say that the
>> one encountered most often when Python developers come into the #solr
>> IRC channel is pysolr.  Our wiki page says the last update for pysolr
>> happened in December of 2013, but I can see that the last version on
>> their web page is dated 2015-05-26.
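Whichever client is used, the point made elsewhere in this thread still applies: batch documents and keep commit=true out of the request. A minimal stdlib sketch of building such an update request (the host, core name, and commitWithin value are assumptions for illustration):

```python
# Sketch: build a Solr JSON update request without commit=true.
# Host, core name, and commitWithin are placeholder assumptions.
import json
from urllib.parse import urlencode

def build_update_request(base_url, docs):
    """Return (url, body) for a batch update. No commit parameter is
    added; commitWithin lets Solr schedule the commit itself."""
    params = urlencode({"commitWithin": 15000})
    url = f"{base_url}/update?{params}"
    body = json.dumps(docs)  # Solr's /update endpoint accepts a JSON array
    return url, body

url, body = build_update_request(
    "http://localhost:8983/solr/mycore",
    [{"id": "1", "word": "hello"}, {"id": "2", "word": "world"}],
)
print(url)
```

The body can then be POSTed with urllib.request (or a client like pysolr); the important part is that the query string carries no commit=true.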
>>
>> Making 100 concurrent indexing requests at the same time as 100
>> concurrent queries will overwhelm *any* single Solr server.  In a
>> previous message you said that you have 4 CPU cores.  The load you're
>> trying to put on Solr will require at *LEAST* 200 threads.  It may be
>> more than that.  Any single system is going to have trouble with that.
>> A system with 4 cores will be *very* overloaded.
>>
>> Thanks,
>> Shawn
>>
>>
