Steve Herborn wrote:

> As far as "Desktop" machines go there hasn't been an application invented that needs more.

Hmmmm.....

> Because memory & disk storage prices fell programmers got sloppy & crammed in a lot more, but little to none of it was actually an application that truly needed more because of purpose, only poor design.

Methinks that this is a broad generalization, and if you don't mind a little pun ... broad generalizations tend to be incorrect.

We don't need more than 640K of RAM ... right? :o

We know of (and have worked with) many applications that have required a tremendous memory footprint. One that required hundreds of GB of RAM in the late 90s might use a bit more today.

Nothing to do with poor design at all, but with very large problems and very memory-intensive algorithms. 3D FFTs still require the actual data to be resident in memory, and if you are finely sampling the configuration/momentum spaces for those FFTs, then, well, you sort of don't have a choice about the footprint.
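
Quick back-of-the-envelope sketch to make the point (the grid sizes and the 16-byte complex-double element size are just my assumptions here, not numbers from any specific code):

    # Rough memory footprint of a dense 3D FFT grid, assuming
    # complex double precision (16 bytes/element) and an
    # out-of-place transform (which roughly doubles the storage).
    def fft3d_footprint_gib(n, bytes_per_elem=16, out_of_place=True):
        factor = 2 if out_of_place else 1
        return n**3 * bytes_per_elem * factor / 2**30

    for n in (512, 1024, 2048):
        print(f"{n}^3 grid: ~{fft3d_footprint_gib(n):.0f} GiB")
    # 512^3 -> ~4 GiB, 1024^3 -> ~32 GiB, 2048^3 -> ~256 GiB

Refine the sampling by 2x in each dimension and the footprint grows 8x. No amount of design cleverness makes that go away.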

Other codes, such as GAMESS, running coupled-cluster calculations have *huge* numbers of matrix elements to compute and store. I doubt that calling them poorly designed is a valid criticism.
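
Again, just a rough sketch (the orbital counts below are illustrative, not from an actual GAMESS run): the CCSD doubles amplitudes alone scale as occ^2 x virt^2.

    # Rough storage for the CCSD doubles amplitudes t_ij^ab,
    # which scale as occ^2 * virt^2 double-precision elements.
    # The orbital counts here are purely illustrative.
    def ccsd_t2_gb(n_occ, n_virt, bytes_per_elem=8):
        return n_occ**2 * n_virt**2 * bytes_per_elem / 1e9

    print(f"100 occ x 1000 virt: ~{ccsd_t2_gb(100, 1000):.0f} GB")
    # ~80 GB for one amplitude tensor, before any intermediates.

And that is a single tensor, before you even count the intermediates.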

There are many other examples, but the point is that there are, in fact, perfectly valid needs for very large memory and very large, very fast I/O systems.

--
Joseph Landman, Ph.D
Founder and CEO
Scalable Informatics LLC,
email: [email protected]
web  : http://www.scalableinformatics.com
       http://jackrabbit.scalableinformatics.com
phone: +1 734 786 8423 x121
fax  : +1 866 888 3112
cell : +1 734 612 4615
_______________________________________________
Beowulf mailing list, [email protected] sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit 
http://www.beowulf.org/mailman/listinfo/beowulf
