On 11/23/20 10:33 pm, Tony Brian Albers wrote:
> What they said:
Thank you all for your kind words on and off list, really appreciated!
My next task is to invite back to the list those who got kicked off when
our previous hosting lost its reverse DNS records and various sites
started rejecting mail from it.
What they said:
|
|
V
Prentice Bisbal via Beowulf wrote:
> I third those kudos. This list has expanded my knowledge of HPC by
> orders of magnitude, and has provided plenty of entertainment, too!
>
> Keep up the good work, and know that it's appreciated!
>
> Prentice
>
> On 11/23/20 9:15 AM, Michael Di Domenico wrote:
Hi Guys,
I am just wondering: what advantages does setting up a cluster have for
big data analytics, versus using something like Hadoop/Spark?
Regards,
Jonathan
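For context, a minimal sketch of the Spark side of that comparison, assuming
PySpark; the file path and column names below are made up purely for
illustration:

    # Minimal PySpark sketch: the kind of data-parallel aggregation
    # Hadoop/Spark are designed for (file path and columns are hypothetical).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("beowulf-example").getOrCreate()

    # Read a CSV spread across the cluster's storage (e.g. HDFS).
    jobs = spark.read.csv("hdfs:///data/jobs.csv", header=True, inferSchema=True)

    # Group and aggregate in parallel across the executors.
    jobs.groupBy("user").agg({"runtime_sec": "avg"}).show()

    spark.stop()

(Something like this would be run with spark-submit or from an interactive
pyspark session.)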
I third those kudos. This list has expanded my knowledge of HPC by
orders of magnitude, and has provided plenty of entertainment, too!
Keep up the good work, and know that it's appreciated!
Prentice
On 11/23/20 9:15 AM, Michael Di Domenico wrote:
> I second a kudos on the hard work. This list has been an invaluable
> source over the many years of my career, I'd hate to see it
> disappear...
Sounds like MPI isn't supported very well now, and perhaps it never was.
In each application the remote jobs all run the same script, but they
receive slightly different parameters, which are read from the
command line. That was easy with PVM, since each remote job
started with its own parameters.
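As a rough sketch of the equivalent pattern under MPI (shown with mpi4py
rather than a Perl binding; the parameter file is hypothetical), every rank
runs the same script and uses its rank to pick its own arguments:

    # Every rank runs this same script; each picks the line of a shared
    # parameter file matching its rank, mimicking PVM's per-job arguments.
    # (mpi4py used for illustration; params.txt is a made-up file.)
    from mpi4py import MPI

    rank = MPI.COMM_WORLD.Get_rank()

    with open("params.txt") as fh:
        params = fh.read().splitlines()

    my_args = params[rank].split()
    print(f"rank {rank} running with parameters: {my_args}")

Launched with something like 'mpirun -np 8 python script.py', so the launcher
itself never has to hand each job its own argument list.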
Appreciate you moving the list! Importantly, the transition looks to be
flawless :)
Cheers!
Lyle
--
On Sun, 22 Nov 2020 18:10:26 -0800
Chris Samuel wrote:
> Hi all,
>
> Today I've moved the Beowulf VM across to its new home at Rimuhosting
> (an NZ based hosting company, using their Dallas DC).
After looking for ic.pl and not finding it, I clicked on it. You can see it
in hypertext, but again I found nothing. Everyone seems to agree the code is
a decade old. Perhaps it is an API call to nothing and should be ignored.
Can you find the test program, or compile without it? It may be
nonexistent.
Jonathan
I second a kudos on the hard work. This list has been an invaluable
source over the many years of my career, I'd hate to see it
disappear...
On Mon, Nov 23, 2020 at 2:28 AM Simon Cross wrote:
>
> Thanks Chris!
if you're referencing this module
https://metacpan.org/changes/distribution/Parallel-MPI-Simple
which hasn't been updated since 2011, i would suspect there's
something wrong in the perl<->mpi library exchange that will prevent
this from working.
i have a vague recollection of trying this way back.
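One way to narrow that down is to confirm the MPI installation itself works
outside of Perl before blaming the binding; a minimal check (mpi4py used here
purely as a stand-in, any MPI hello-world would do the same job):

    # Minimal MPI sanity check, independent of Parallel::MPI::Simple.
    # If this fails under mpirun, the problem is the MPI stack itself,
    # not the Perl<->MPI glue.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print(f"hello from rank {comm.Get_rank()} of {comm.Get_size()}")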