On Fri, Jul 20, 2007 at 10:43:10AM -0700, Gilad Shainer wrote:

> You had previously compared Mellanox-based products on dual single-core
> machines to the "InfiniPath" adapter on dual dual-core machines and
> claimed that with InfiniPath there are more Gflops.... This latest
> release follows the same lines...

Strong words from a guy whose only published performance whitepaper
compares Fluent performance across very different versions of
Fluent. (No, they aren't comparable.)

If you're referring to one of my old whitepapers: where I compared
single-core to dual-core, I made that clear, and there was no direct
data to compare against. As you are aware, there's very little
published performance data for Mellanox adapters, but I'm sure you'll
work hard at making such info available. MPI2007 is a great opportunity
to compare performance: I encourage you to give it your best shot.

Also, your use of "you" is overly broad, so let me give you a guide
for telling the difference between Kevin Ball and Greg Lindahl:

Kevin                           Greg

diplomatic                      rude
physicist                       astronomer
short hair                      long hair
ballroom dance                  renaissance dance
thin                            middle-age spread
tall                            claims he's "average height"
too young to be cynical         "you kids! when I was YOUR age..."

> Unlike QLogic InfiniPath adapters, Mellanox provides different
> InfiniBand HCA silicon and adapters.

Er, there are two kinds of InfiniPath silicon; thanks for noticing.

I'll be looking forward to seeing lots of MPI2007 benchmark results from
you, Gilad.

-- greg
(used to work for PathScale, apparently still vehement about it. :-)