On Wed, Apr 18, 2007 at 03:16:39PM +0530, Siju George wrote:
> Hi,
>
> How do you handle it when you have to serve terabytes of data through
> http/https/ftp etc.?
It depends :-)

> Put it on different machines and use some kind of
> loadbalancer/intelligent program that directs to the right machine?

It depends :-)  (A minimal sketch of that idea is below.)

> use some kind of clustering software?

It depends :-)

> What hardware do you use to make your system scalable from a few
> terabytes of data to a few hundred of them?

It depends :-)

> Does Debian have any clustering software packages?

Yes.  I know that the RedHat cluster tools have been packaged for
Debian.  There might be others as well (apt-cache search cluster
should turn them up).

Basically, what you do depends on the following:

- how many users will access the data?
- is the data static, or server-side code of some kind?
- what kind of load does it produce?
- how capable is the machine (or machines) you currently have?
- are the users accessing this via a fast local network? a fast WAN?
  a slow WAN?
- how reliable must the service be?
- what are your cost and other constraints?

You might also want to ask your question on the debian-isp list.  By
the looks of it, you have a fairly heavy-duty requirement.  There are
more likely to be people on that list who have experience with larger
setups like the one you appear to be headed toward.
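Just to illustrate the load-balancer idea from above: the "intelligent
program that directs to the right machine" can start out as something
as small as a round-robin reverse proxy in front of the data servers.
Here is a minimal sketch in Go; the backend hostnames are placeholders,
and a real setup would add health checks and probably route by content
(e.g. by URL prefix) rather than blindly rotating:

    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"
        "sync/atomic"
    )

    func main() {
        // Backends holding the data (placeholder hostnames).
        backends := []string{
            "http://data1.example.com",
            "http://data2.example.com",
        }

        var proxies []*httputil.ReverseProxy
        for _, b := range backends {
            u, err := url.Parse(b)
            if err != nil {
                log.Fatal(err)
            }
            proxies = append(proxies, httputil.NewSingleHostReverseProxy(u))
        }

        // Round-robin: each request goes to the next backend in turn.
        var next uint64
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
            i := atomic.AddUint64(&next, 1) % uint64(len(proxies))
            proxies[i].ServeHTTP(w, r)
        })

        log.Fatal(http.ListenAndServe(":8080", nil))
    }

Run that on the front machine and point clients at port 8080.  Whether
something this simple, a dedicated proxy/balancer, or DNS round-robin
is appropriate again depends on the answers to the questions above.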
Regards,

-Roberto

-- 
Roberto C. Sánchez
http://people.connexer.com/~roberto
http://www.connexer.com