I'm curious, though, not to change the topic here.

Mark, you mention dual socket. With what AMD has to offer in the second-generation EPYC (Rome)
series, you can actually do more with less. I am just curious as to everyone's
thoughts on them for a cluster?

Regards,
Jonathan Aquilina

EagleEyeT
Phone +356 20330099
Sales – sa...@eagleeyet.net
Support – supp...@eagleeyet.net

From: Beowulf <beowulf-boun...@beowulf.org> On Behalf Of Lux, Jim (US 337K) via 
Beowulf
Sent: Sunday, 2 February 2020 22:09
To: beowulf@beowulf.org
Subject: Re: [Beowulf] [EXTERNAL] Re: First cluster in 20 years - questions 
about today

How old is the old cluster?  You might actually spend more time trying to get 
the nodes all working than you’d save by having them as a compute element.  
I’ve looked at piles of computers in my garage and thought “hey, I should 
cluster them,” and then realized that the discount laptop I can buy for a few 
hundred bucks would blow the combination away.

So, unless you’re “learning how to build a cluster”, I wouldn’t think that’s 
the way to go.  And for the “how to bring up a cluster tinkering”, a batch of 
rPi or beagles and a cheap switch is probably cheaper and more reflective of 
modern distros.

OTOH, if your three old nodes are a year old, then have at it.


From: Beowulf <beowulf-boun...@beowulf.org> on behalf of "jaquil...@eagleeyet.net" <jaquil...@eagleeyet.net>
Date: Saturday, February 1, 2020 at 10:45 PM
To: Mark Kosmowski <mark.kosmow...@gmail.com>, "beowulf@beowulf.org" <beowulf@beowulf.org>
Subject: [EXTERNAL] Re: [Beowulf] First cluster in 20 years - questions about 
today

Hi Mark,

So you are going to revive your old 3-node cluster and expand it? If you are looking to 
expand the cluster, I would suggest looking at the second-generation AMD EPYC (Rome) chips. 
They are quite impressive, in the sense that one of these chips can do the work of two 
single Intel chips. They range from 8 cores / 16 threads up to 64 cores / 128 threads. You 
also don’t have all the issues Intel is currently facing with its 7 nm process, as well as 
the vulnerabilities. I just moved my gaming PC from a 6th-gen i7 to a Ryzen 5 3600 
(6 cores / 12 threads) and I’m seeing a huge difference in performance.


Regards,
Jonathan Aquilina

EagleEyeT
Phone +356 20330099
Sales – sa...@eagleeyet.net
Support – supp...@eagleeyet.net

___________________________________________________________________________________________

From: Beowulf <beowulf-boun...@beowulf.org> 
On Behalf Of Mark Kosmowski
Sent: Sunday, 2 February 2020 04:21
To: beowulf@beowulf.org
Subject: [Beowulf] First cluster in 20 years - questions about today

I've been out of computation for about 20 years, since my master's degree.  I'm 
getting into the game again as a private individual.  When I was active, Opteron 
had just launched - I was an early adopter of amd64 because I needed the RAM 
(maybe more accurately, I needed to thoroughly thrash my swap drives).  I never 
needed any cluster management software with my 3-node, dual-socket, single-core 
little baby Beowulf.  (My planned domain is computational chemistry, and I'm 
hoping to get to a point where I can do ab initio catalyst surface reaction 
modeling of small molecules (not biomolecules).)

I'm planning to add a few nodes, and it will end up being fairly heterogeneous.  
My initial plan is to add two or three multi-socket, multi-core nodes as well 
as a 48-port gigabit switch.  How should I assess whether to have one big 
heterogeneous cluster vs. two smaller quasi-homogeneous clusters?

Will it be worthwhile to learn cluster management software?  If so, any 
suggestions?

Should I consider Solaris or illumos?  I do plan on using ZFS, especially for 
the data node, but I want as much redundancy as I can get, since I'm going to 
be using used hardware.  Will the fancy Solaris cluster tools be useful?

Also, once I get running, while I'm getting current with theory and software, 
may I inquire here about taking on a small, low-priority academic project to 
make sure the cluster side is working well?

Thank you all for still being here!
_______________________________________________
Beowulf mailing list, Beowulf@beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit 
https://beowulf.org/cgi-bin/mailman/listinfo/beowulf
