On 14 Apr 2013, at 10:01, Jonathan Dursi wrote:

> I could not agree with this piece more.  
> 
> Widespread cloud availability, GPUs, etc., have enabled all sorts of weird, 
> wacky, and _useful_ large-scale technical computing use cases, and arguing 
> about whether new use case X is "really" HPC has long since lost whatever 
> novelty it had.  I'm pleased to see Jeff Layton using the broader term 
> "Research Computing"; in my corner of the world I've been pushing for the 
> term Advanced R&D Computing (ARC) as a catch-all for any sort of 
> technical/numerical computing that requires you to do something "special" 
> (e.g., something other than running naive serial code on a desktop).   
> Someone else can probably come up with a better name, but I actually think 
> that holding on to terms that already carry strong connotations is hurting 
> more than helping at this point.

I've taken to saying I work in "Computing" as a field distinct from "IT".  The 
difference is that Computing is about using computers for calculation and 
analytics, rather than as a tool for doing something else.

Ashley,
_______________________________________________
Beowulf mailing list, Beowulf@beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit 
http://www.beowulf.org/mailman/listinfo/beowulf
