> > > Initially, we are deciding to use a Gigabit Ethernet switch and 1 GB of
> > > RAM at each node.
>
> That seems like an odd choice. It's not much RAM, and gigabit is
> extremely slow (relative to the alternatives, or in comparison to
> on-board memory access).
This is a common misconception
I sort of agree :)
We've seen dual-core do very well on most CFD applications. For instance,
switching to dual-core on Fluent only results in about a 5% loss of performance.
On other CFD codes the difference is in the noise. So I would recommend
dual-core CPUs.
I echo Michael's comments about 1 G
Dual-CPU, single-core Opterons. Make sure that 1 GB of RAM is enough
for your application.
Also consider a low-latency interconnect such as InfiniBand, because I have
seen cases where CFD codes exchange a lot of small messages.
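To make the latency point concrete, here is a rough back-of-the-envelope sketch in Python; the latency and bandwidth figures are assumed ballpark values for illustration, not measurements from any particular hardware.

# Back-of-the-envelope model: time per message = latency + size / bandwidth.
# The figures below are assumed ballpark values, not measured numbers.
interconnects = {
    "gigabit ethernet": (50e-6, 125e6),  # ~50 us latency, ~125 MB/s (assumed)
    "infiniband":       (5e-6, 900e6),   # ~5 us latency, ~900 MB/s (assumed)
}

def transfer_time(size_bytes, latency, bandwidth):
    """Estimated time in seconds to send one message of size_bytes."""
    return latency + size_bytes / bandwidth

for name, (lat, bw) in interconnects.items():
    for size in (1024, 1024 * 1024):  # 1 KB and 1 MB messages
        t = transfer_time(size, lat, bw)
        print(f"{name:17s} {size:>8d} bytes: {t * 1e6:9.1f} us")

Under these assumptions, the fixed latency dominates for kilobyte-sized messages, so the low-latency fabric is roughly an order of magnitude faster per message; for megabyte-sized messages the difference is mostly bandwidth.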
Michael Will
--
SE Technical Lead
Penguin Computing
As I'm not a complete layman in fluid dynamics
software, I'll make a few assumptions to advise a choice:
Assumptions:
a) that gigabit is enough for you and that you
don't need much bandwidth to other nodes
b) that you are interested in having a huge amount
of RAM at each compute node
c) that price
Some disdain using SSH for this purpose at all, but if you want to use SSH
for whatever reason, google for "ssh hostbased site:liniac.upenn.edu". I can't
speak to that being *necessary* for MPICH to run. I'm sure you could just
set up hostbased rsh (using /etc/hosts.equiv) and be done with it, although
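For what it's worth, a minimal hostbased-authentication sketch for OpenSSH looks roughly like this; the hostnames are made-up placeholders, and the liniac.upenn.edu write-up mentioned above covers the remaining details.

# On every node, in /etc/ssh/sshd_config (server side):
HostbasedAuthentication yes

# On every node, in /etc/ssh/ssh_config (client side):
HostbasedAuthentication yes
EnableSSHKeysign yes

# List the trusted cluster hosts in /etc/ssh/shosts.equiv
# (the SSH counterpart of /etc/hosts.equiv); these names are examples:
node01.cluster.example
node02.cluster.example

# Each node's public host key must also appear in /etc/ssh/ssh_known_hosts.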
> > 1. One processor at each of the compute nodes
> > 2. Two processors (on one motherboard) at each of the compute nodes
> > 3. Two processors (each one a dual-core processor; total 4 cores on one motherboard) at each of the compute nodes
> > 4. Four processors (on one motherboard) at each of the compute nodes.
Not considering a 4x2 (four dual-core processors per node)?
In message from "amjad ali" <[EMAIL PROTECTED]> (Thu, 15 Jun 2006 04:02:12 -0400):
Hi ALL
We are going to build a true Beowulf cluster for Numerical Simulation of
Computational Fluid Dynamics (CFD) models at our university. My question is:
what is the best choice for us out of the following
Hi ALL
We are going to build a true Beowulf cluster for Numerical Simulation of
Computational Fluid Dynamics (CFD) models at our university. My question is:
what is the best choice for us, out of the following choices about processors,
for a given fixed/specific budget:
1. One processor at each of the compute nodes
Vincent:
The HPC market is a far cry from the game console market. Having served
my time in both publishing and marketing (in past lives), I can
understand your point.
But I don't believe that the point is completely relevant. For example,
even if Sun offered me their E2900 for the same price
We will have some real benchmarks announced over the next few months:
microbenchmarks, industry benchmarks, and application benchmarks. I am
not going to throw out any numbers right here because I don't have all
the details yet and some of the driver stacks are still being tuned. But
our testing s
Chris,
I may have talked to that friend. The real issue is that the earlier
compiler worked, then the later one didn't. The latest version does work,
however.
I do not believe that anyone broke anything to do with the software. I
have compiled G03 on a variety of platforms. This case is the first