Nicole,

According to our findings, there is also a limit on the number of shards,
depending on the volume of the returned data. See this JIRA:

https://issues.apache.org/jira/browse/SOLR-4903
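
To illustrate the GET URL length issue you found (a rough sketch with
hypothetical host names, just to show how quickly the URL grows with the
shard count):

  # each shard adds a "host:port/core" entry to the shards parameter of the GET URL
  shards = ",".join("host%d:8983/solr/core1" % i for i in range(1, 201))
  url = "http://host1:8983/solr/core1/select?q=*:*&shards=" + shards
  print(len(url))  # roughly 5 KB for 200 shards

Servlet containers usually cap the request line and headers at a few KB
(often 8 KB by default), which is where that GET limit kicks in.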

Dmitry


On Thu, Jul 25, 2013 at 11:25 AM, Nicole Lacoste <niki.laco...@gmail.com> wrote:

> Oh, found the answer myself.  It's the GET method's URL length that limits
> the number of shards.
>
> Niki
>
>
> On 25 July 2013 10:14, Nicole Lacoste <niki.laco...@gmail.com> wrote:
>
> > Is there a limit on the number of shards?
> >
> > Niki
> >
> >
> > On 24 July 2013 01:14, Jack Krupansky <j...@basetechnology.com> wrote:
> >
> >> 2.1 billion documents (including deleted documents) per Lucene index,
> >> but essentially per Solr shard as well.
> >>
> >> But don't even think about going that high. In fact, don't plan on going
> >> above 100 million unless you do a proof of concept that validates that
> >> you get acceptable query and update performance. There is no hard limit
> >> besides that 2.1 billion Lucene limit, but... performance will vary.
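> >>
> >> (For reference, that figure is Lucene's internal document id ceiling:
> >> doc ids are Java ints, so the maximum is around Integer.MAX_VALUE =
> >> 2^31 - 1 = 2,147,483,647 documents per index.)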
> >>
> >> -- Jack Krupansky
> >>
> >> -----Original Message----- From: Ali, Saqib
> >> Sent: Tuesday, July 23, 2013 6:18 PM
> >> To: solr-user@lucene.apache.org
> >> Subject: maximum number of documents per shard?
> >>
> >> still 2.1 billion documents?
> >>
> >
> >
> >
> > --
> > * <https://twitter.com/#!/niki_in_france>*
> >
>
>
>
> --
> * <https://twitter.com/#!/niki_in_france>*
>
