Did this test a while back and am revisiting it now. In standalone Solr we have seen queries take more time when the data lives in 2 shards; that was the main reason this test was done. If anyone has experience with this, I would like to hear about it.
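As a rough illustration of that standalone comparison, here is a minimal SolrJ sketch (the host, core names, and field below are made-up placeholders): it runs the same query once against a single core with distrib=false and once fanned out over two cores via the shards parameter, and prints the QTime reported for each.

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class ShardTimingSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical standalone Solr node and core names.
        try (HttpSolrClient client =
                 new HttpSolrClient.Builder("http://localhost:8983/solr/core1").build()) {

            // Single-core query: only the local index is searched.
            SolrQuery local = new SolrQuery("title_s:example");
            local.set("distrib", "false");
            QueryResponse localRsp = client.query(local);
            System.out.println("single-core QTime (ms): " + localRsp.getQTime());

            // Distributed query: fans out to both cores and merges the results,
            // which adds an extra request/merge step per query.
            SolrQuery distributed = new SolrQuery("title_s:example");
            distributed.set("shards",
                "localhost:8983/solr/core1,localhost:8983/solr/core2");
            QueryResponse distRsp = client.query(distributed);
            System.out.println("distributed QTime (ms): " + distRsp.getQTime());
        }
    }
}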
On Tue, Jun 30, 2020 at 11:50 PM Jörn Franke <jornfra...@gmail.com> wrote:

> How many documents?
> Was the real difference only a couple of ms?
>
> > On 01.07.2020 at 07:34, Raji N <rajis...@gmail.com> wrote:
> >
> > Had 2 indexes in 2 separate shards in one collection, with the exact same
> > data published with the composite router using a prefix. Disabled all
> > caches. Issued the same query, a small query with a q parameter and an fq
> > parameter. The number of queries that got executed (with the same threads,
> > run for the same time) was higher in the 2-indexes-in-2-separate-shards
> > case. The 90th percentile response time was also a few ms better.
> >
> > Thanks,
> > Raji
> >
> >> On Tue, Jun 30, 2020 at 10:06 PM Jörn Franke <jornfra...@gmail.com> wrote:
> >>
> >> What did you test? Which queries? What were the exact results in terms of
> >> time?
> >>
> >>> On 30.06.2020 at 22:47, Raji N <rajis...@gmail.com> wrote:
> >>>
> >>> Hi,
> >>>
> >>> We are trying to place multiple smaller indexes in one collection (as we
> >>> read that SolrCloud performance degrades as the number of collections
> >>> increases). We are exploring two ways:
> >>>
> >>> 1) Placing each index on a single shard of a collection
> >>>
> >>>    In this case, placing documents for a single index is manual and
> >>>    automatic rebalancing is not done by Solr.
> >>>
> >>> 2) Solr routing with the composite router and a prefix
> >>>
> >>>    In this case, Solr doesn't place all the docs with the same prefix in
> >>>    one shard, so searches become distributed. But shard rebalancing is
> >>>    taken care of by Solr.
> >>>
> >>> We did a small perf test with both of these setups. We saw that the
> >>> performance for the first case (placing an index explicitly on a shard)
> >>> is better.
> >>>
> >>> Has anyone done anything similar? Can you please share your experience?
> >>>
> >>> Thanks,
> >>>
> >>> Raji
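For reference, a minimal SolrJ sketch of the composite-router-with-prefix setup from the quoted thread, assuming SolrJ 8.x plus a made-up ZooKeeper address (localhost:2181), collection name (multi_index), and prefix (tenantA): each document id carries a "tenantA!" prefix, and the query passes _route_=tenantA! so Solr sends it only to the shard(s) that own that prefix instead of fanning out to every shard.

import java.util.Collections;
import java.util.Optional;

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrInputDocument;

public class CompositeRoutingSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical ZooKeeper ensemble and collection name.
        CloudSolrClient client = new CloudSolrClient.Builder(
                Collections.singletonList("localhost:2181"), Optional.empty()).build();
        client.setDefaultCollection("multi_index");

        // compositeId routing: every doc sharing the "tenantA!" prefix hashes
        // into the same shard range of the collection.
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "tenantA!doc1");
        doc.addField("title_s", "example");
        client.add(doc);
        client.commit();

        // _route_ restricts the query to the shard(s) that own the prefix
        // rather than distributing the request across the whole collection.
        SolrQuery query = new SolrQuery("title_s:example");
        query.set("_route_", "tenantA!");
        QueryResponse rsp = query == null ? null : client.query(query);
        System.out.println("hits: " + rsp.getResults().getNumFound());

        client.close();
    }
}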