I try to keep the index directory size smaller than the amount of RAM and
rely on the OS to cache it as needed. Linux does a pretty good job here, and
I am sure OS X will do a good job as well.
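
(Rough arithmetic with your numbers, just as a sanity check: ~3 TB of index
spread across boxes with ~16 GB of RAM each works out to roughly 3000/16,
i.e. on the order of 190 shards if you follow that rule strictly for the
full 30 days, so the per-server index size you settle on drives the machine
count directly.)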

Distributed search will be your friend here: you can chunk the index up
across a number of servers to keep your cost down (2 GB RAM sticks are much
cheaper than 4 GB sticks, roughly $20 vs. $100).
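
For example (the hostnames below are just placeholders), a distributed query
in Solr is an ordinary query with a shards parameter listing the shards to
fan out to, something like:

  http://host1:8983/solr/select?q=error&shards=host1:8983/solr,host2:8983/solr,host3:8983/solr

Each shard holds part of the index (e.g. a few days of logs) and Solr merges
the results for you.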

Ian.

On Wed, Aug 5, 2009 at 1:44 PM, Silent Surfer <silentsurfe...@yahoo.com> wrote:

>
> Hi ,
>
> We are planning to use Solr for indexing the server log contents.
> The expected processed log file size per day: 100 GB
> We are expecting to retain these indexes for 30 days (100*30 ~ 3 TB).
>
> Can anyone provide guidance on the optimal size of index that I can store
> on a single server without hampering search performance, etc.?
>
> We are planning to use an OS X server with 16 GB of RAM (can go up to 24
> GB).
>
> We need to figure out how many servers are required to handle such an
> amount of data.
>
> Any help would be greatly appreciated.
>
> Thanks
> SilentSurfer
>


-- 
Regards,

Ian Connor
1 Leighton St #723
Cambridge, MA 02141
Call Center Phone: +1 (714) 239 3875 (24 hrs)
Fax: +1(770) 818 5697
Skype: ian.connor
