Really hard to say. Are you saying that you'll have on the
order of 200M documents (2 TB at 5-10 KB each)? What kinds
of searches are you expecting to do? Sorting? Faceting?
Really, the only way to know is to measure. You'll probably
have to load up a single machine with successively larger
data sets until you overload it, then extrapolate.
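For a very rough sense of the extrapolation step, here's a sketch in
Python. All numbers are hypothetical placeholders; substitute the
latencies you actually measure at each load point:

from statistics import linear_regression  # Python 3.10+

# (documents indexed, 95th-percentile query latency in ms) measured on
# one machine at successively larger loads -- hypothetical example data.
samples = [
    (10_000_000, 40.0),
    (20_000_000, 85.0),
    (40_000_000, 170.0),
    (80_000_000, 360.0),
]

slope, intercept = linear_regression(
    [docs for docs, _ in samples],
    [ms for _, ms in samples],
)

# Project how many documents fit under a (hypothetical) latency budget,
# then how many machines ~200M documents would need at that density.
budget_ms = 200.0
docs_per_machine = int((budget_ms - intercept) / slope)
total_docs = 200_000_000  # ~2 TB at ~10 KB/document
machines = -(-total_docs // docs_per_machine)  # ceiling division

print(f"~{docs_per_machine:,} docs/machine -> ~{machines} machines")

A straight-line fit is crude; memory pressure usually bends the curve
upward well before you hit the wall, so treat the result as a floor on
the machine count and re-measure as you go.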

This may be a good place to start:
http://wiki.apache.org/solr/SolrPerformanceFactors

FWIW,
Erick

On Wed, Sep 22, 2010 at 1:55 PM, Vaibhav Shrivastava <
vaibhav.s.mn...@gmail.com> wrote:

> Hi,
>    I am a new user in this group. I want to use Solr to deploy an
> index over roughly 2 TB of documents. Could someone help me estimate
> the number of machines required to serve this index, assuming I use
> Amazon machine instances for this?
>
> The details regarding the system are as follows:
>
> 1. Number of fields to be indexed: 3 to 5.
> 2. The fields should be relatively small; the corresponding document
> size could be around 5-10 KB.
> 3. The index should be updated daily.
>
> --
> Vaibhav Shrivastava,
> Graduate Student,
> MS Computer Science,
> Stony Brook University.
>
