That is why people don't use search engines to manage logs. Look at a Hadoop cluster instead.

wunder

On Aug 5, 2009, at 10:08 PM, Silent Surfer wrote:


Hi,

That means we need approximately 3000 GB (index size) / 24 GB (RAM per server) = 125 servers.

It would be very hard to convince my org to go for 125 servers to manage 3 TB of log indexes.

Has anyone used Solr with indexes on the order of 3 TB? If so, how many servers were used for indexing alone?

Thanks,
sS


--- On Wed, 8/5/09, Ian Connor <ian.con...@gmail.com> wrote:

From: Ian Connor <ian.con...@gmail.com>
Subject: Re: Limit of Index size per machine..
To: solr-user@lucene.apache.org
Date: Wednesday, August 5, 2009, 9:38 PM
I try to keep the index directory size less than the amount of RAM and rely on the OS to cache as it needs. Linux does a pretty good job here and I am sure OS X will do a good job also.
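
A quick way to sanity-check that on a given box (the path here is just an illustration; point du at your actual index directory):

    du -sh /var/solr/data/index   # on-disk size of the index
    free -g                       # total/free RAM (Linux)

If the du figure fits comfortably under free RAM, the OS page cache can hold most of the index.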

Distributed search will be your friend here: you can chunk the index up across a number of servers to keep your cost down (2 GB RAM sticks are much cheaper than 4 GB sticks, roughly $20 vs. $100).
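
A sharded query just lists the shards to merge over, something like this (host names are made up):

    http://server1:8983/solr/select?q=error&shards=server1:8983/solr,server2:8983/solr,server3:8983/solr

Each shard holds a slice of the index, and the node you query merges the results.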

Ian.

On Wed, Aug 5, 2009 at 1:44 PM, Silent Surfer <silentsurfe...@yahoo.com> wrote:


Hi,

We are planning to use Solr to index our server log contents. The expected processed log file size is 100 GB per day, and we expect to retain these indexes for 30 days (100 GB * 30 ~ 3 TB).
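
For context, each log entry would become a Solr document, posted with something like the following (field names are hypothetical; the real ones would depend on our schema.xml):

    # in practice we'd batch many <doc>s per request and commit periodically
    curl 'http://localhost:8983/solr/update?commit=true' \
         -H 'Content-Type: text/xml' --data-binary \
         '<add><doc>
            <field name="id">host1-20090805-000001</field>
            <field name="timestamp">2009-08-05T13:44:00Z</field>
            <field name="message">sample log line text</field>
          </doc></add>'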

Can anyone tell me the optimal index size I can store on a single server without hampering search performance?

We are planning to use OS X servers with 16 GB of RAM each (can go to 24 GB).

We need to figure out how many servers are required to handle that amount of data.

Any help would be greatly appreciated.

Thanks
SilentSurfer







--
Regards,

Ian Connor
1 Leighton St #723
Cambridge, MA 02141
Call Center Phone: +1 (714) 239 3875 (24 hrs)
Fax: +1(770) 818 5697
Skype: ian.connor





