Passwordless SSH works between all nodes.
Firewalls are disabled.
From: g...@r-hpc.com [mailto:g...@r-hpc.com] On Behalf Of Greg Keller
Sent: Wednesday, September 19, 2012 8:43 PM
To: beowulf@beowulf.org; Antti Korhonen
Subject: Re: [Beowulf] Cannot use more than two nodes on cluster
I am going to bet $0.25 that SSH or TCP/IP is configured to allow the
master to get to the nodes without a password, but not from one Compute to
the other Compute.
Test by sshing to Compute1, then from Compute1 to Compute2. Depending on
how you built the cluster, it's also possible there is iptables in the way.
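A minimal sketch of that hop test, run from the master; the hostnames compute1/compute2 are placeholders for the actual node names (the original post doesn't give them):

```shell
# Hop test: log in to one compute node, then ssh onward to another.
# BatchMode=yes makes ssh fail fast instead of prompting for a
# password: a prompt on this hop is exactly what hangs the MPI job.
ssh compute1 'ssh -o BatchMode=yes compute2 hostname'

# If the hop fails or hangs, inspect the firewall rules on the node:
ssh compute1 'iptables -L -n'
```

If the first command prints compute2's hostname with no prompt, compute-to-compute SSH is fine and the problem lies elsewhere.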
Hello
I have a small Beowulf cluster (master and 3 slaves).
I can run jobs on any single node.
Running on two nodes sort of works: running jobs on master and 1 slave works.
(all combos, master+slave1 or master+slave2 or master+slave3)
Running jobs on two slaves hangs.
Running jobs on master + any slave works.
I've understood that while Phi is the product line, MIC will still remain the
generic name for the architecture family (similar to x86).
To add to the acronym soup there is also a separate naming scheme for the
binary architecture: if you look at newer binutils packages, the current
(Knight's Corner) target is called k1om.
I taught an MPI class a few times and wanted something simple, fun, and
that could be improved upon several times as the students learned MPI. It's
obviously embarrassingly parallel, but non-trivial to do well. There's
often not enough work per pixel or per image to keep the communications
overhead low.
Bringing up an excellent question for "learning to cluster" activities:
what would be a good sample program to try? There was (is?) an MPI version of
POV-Ray, as I recall. It was nice because it's showy and you can easily see if
you're getting a speedup.
Computing pi isn't very dramatic, especially ...
>
> Anyway off to program my Raspberry Pee (!)
>
> Daniel
> Bull Information Systems
___
Beowulf mailing list, Beowulf@beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit
http://www.beowulf.org/mailman/listinfo/beowulf
On 09/18/2012 10:46 PM, Christopher Samuel wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> On 19/09/12 12:04, Vincent Diepeveen wrote:
>
>> Maybe do it wireless at around a 2800Mhz frequency instead of using
>> cables. Then stack them all up in a 2nd hand microwave and invent
>> the ...
Daniel Kidger wrote:
> The technology was also known as MIC : pronounced 'Mick' or 'Mike'
> depending on who you spoke to.
> That was confusing - so with PHI it is now unambiguous right? er no -
> I hear people say both 'Fee' and 'Fi'
>
FWIW - last week at Intel IDF2012 in San Francisco every single presenter pronounced 'Phee'.
Anyone running that switch in production? Any issues under load?
On Tue, Sep 18, 2012 at 10:10 AM, Daniel Kidger wrote:
> I touched on the Gromacs port to ClearSpeed when I worked there - I then
> went on to write the port of AMBER to CS
> plus I have a pair of RPis that I tinker with.
I'm not quite sure what the interest is... GROMACS is quite famous for
having ...