In the article: "What if a high level programming description language was developed. Note I did not say programming language. This description language would allow you to “describe” what you needed to do and not how to do it (as discussed before)."
I would ask, then: how does one "describe what you need to do"? As a brief (and admittedly simplistic) example, suppose we want to replace a specific string in a long text file with another string. In assembler this is a long, painfully explicit process in which we spell out exactly what the machine must do. One could argue that even at the C level we retain a fair amount of control, but once we reach fairly high-level languages we can simply write text.replaceAll(oldstring, newstring); and be done. We have told the program what we want done, not how to do it. Would I call Java, C#, etc. "programming description languages"? No. So I wouldn't call an even higher-level HPC language a description language either.

In the article: "This draft description would then be presented to an AI based clarifier which would examine the description, look for inconsistencies or missing information and work with the programmer to create a formal description of the problem." That sounds like regular programming in an intolerant IDE, dressed up in fancy terminology.

In the article: "At that point the description is turned over to a really smart compiler that could target a particular hardware platform and produce the needed optimized binaries. Perhaps a GA could be thrown in to help optimize everything." Later it is also suggested that "Maybe it would take a week to create a binary, but it would be cluster time and not your time." In reality, the genuinely troublesome (read: useful) problems have truly long running times. With a GA, which produces vastly more bad solutions than good ones, we would have to measure the fitness not only of the really good solution (which, for those useful problems, could take a week or more at best) but also of the really poor one that swaps constantly and computes redundantly. That could take years...
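To make that cost argument concrete, here is a minimal Java sketch of a generational GA (everything here is my own illustration, not anything from the article: the class and method names are invented, and a toy bit-counting fitness stands in for a real benchmark). The point is the inner loop: evaluate() is called once per candidate per generation, good or bad alike, so if the "genome" were a compiled binary, every poor candidate would still cost a full build-and-benchmark run on the cluster.

```java
import java.util.Random;

// Hypothetical sketch of a generational GA over 32-bit genomes.
// Toy fitness (popcount) stands in for "benchmark score of a binary".
public class GaSketch {
    static final Random RNG = new Random(42);
    static final int POP_SIZE = 20;
    static final int GENERATIONS = 50;

    // In the compiler scenario this call would be a full compile-and-run;
    // here it is just the number of set bits.
    static int evaluate(int genome) {
        return Integer.bitCount(genome);
    }

    // Pick the fitter of two random individuals (tournament selection).
    static int tournament(int[] pop, int[] fit) {
        int a = RNG.nextInt(POP_SIZE), b = RNG.nextInt(POP_SIZE);
        return fit[a] >= fit[b] ? pop[a] : pop[b];
    }

    // Runs the GA and returns the total number of fitness evaluations --
    // the quantity that maps directly to cluster time.
    static long run() {
        int[] pop = new int[POP_SIZE];
        for (int i = 0; i < POP_SIZE; i++) pop[i] = RNG.nextInt();

        long evaluations = 0;
        for (int gen = 0; gen < GENERATIONS; gen++) {
            int[] fit = new int[POP_SIZE];
            for (int i = 0; i < POP_SIZE; i++) {
                fit[i] = evaluate(pop[i]); // paid for EVERY candidate
                evaluations++;
            }
            int[] next = new int[POP_SIZE];
            for (int i = 0; i < POP_SIZE; i++) {
                int child = tournament(pop, fit);
                child ^= 1 << RNG.nextInt(32); // single-bit mutation
                next[i] = child;
            }
            pop = next;
        }
        return evaluations;
    }

    public static void main(String[] args) {
        System.out.println("total fitness evaluations: " + run());
    }
}
```

Even this tiny run performs 20 x 50 = 1000 fitness evaluations; if each one were a week-long benchmark of a candidate binary, the arithmetic above about "years" follows immediately.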
The basic premise of a GA for code is genetic programming, or more generally an evolutionary algorithm, and the same problem exists there: bad solutions monopolize time on the cluster. Compilers will eventually be entirely AI-based (though I doubt I will see it), and by then the singularity will already have happened and effectively infinite resources will be available, since designing hardware is naturally more space-constrained than designing software. All I am saying is that, for right now, we are making the most of what we have without involving AI that extensively in our programming.

Just my opinions, and no hard feelings toward Doug. I typically enjoy his articles thoroughly.

Ellis

_______________________________________________
Beowulf mailing list, Beowulf@beowulf.org
To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf