On 2015-10-16 19:52, Kurt Pattyn wrote:
> On 17 Oct 2015, at 01:18, Marc Mutz wrote:
>> I find it intolerable that today's software takes 1000x the memory,
>> 1000x the CPU capacity and doesn't get a given job done in less
>> wallclock time than software 20 years ago.
>
> These are 'numbers'. Is it like that? Really 1000x the memory, 1000x the
> CPU capacity?
Er... yes? An 80286 of yesteryear ran as fast as 25 MHz¹. A modern, 6-core
4 GHz CPU has a "theoretical" 24 GHz of "computing power". That's almost
exactly 1000x (more, considering the ISA improvements).

The same machine could address up to 16 MiB of RAM. A lot of machines around
my office have 16 GiB of RAM (my home desktop has more; OTOH, many home
machines have less). That's pedantically a factor of 1024x; again, close
enough.

Basically, yeah... computers some 10-20 years ago had similar specs to
modern computers - CPU speed in the single to low double digits, memory in
similar ranges - except the numbers were in units one 1000/1024-order of
magnitude smaller (MHz vs. GHz, MiB vs. GiB). The increase in disk space is,
if anything, even more extreme.

There remains, of course, *some* exaggeration there... modern computers are,
after all, tackling quantities of data that computers 20 years ago couldn't
dream of touching (consider any process run on large clusters, for example).
Even so, it's sobering to think about, especially as a lot of user software
is "prettier" but in many ways not actually *better*, at least not in
proportion to the increase in resource requirements.

(¹ https://en.wikipedia.org/wiki/Intel_80286)

--
Matthew
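[For the pedants, the ratios above do check out as back-of-the-envelope
arithmetic. Here is that arithmetic as a small Python sketch; the clock
speed, core count, and memory sizes are the figures quoted in the post,
nothing else is assumed:

  # Quick check of the ~1000x claims above.
  mhz_286 = 25                      # fastest 80286 variants, in MHz
  ghz_modern = 6 * 4.0              # 6 cores x 4 GHz, "theoretical" GHz

  cpu_ratio = ghz_modern * 1000 / mhz_286
  print(f"CPU: {cpu_ratio:.0f}x")   # 960x -- "almost exactly 1000x"

  mib_286 = 16                      # 80286's 24-bit address space, in MiB
  gib_modern = 16                   # a typical office desktop, in GiB

  ram_ratio = gib_modern * 1024 / mib_286
  print(f"RAM: {ram_ratio:.0f}x")   # exactly 1024x

The CPU figure comes out at 960x rather than a round 1000x, which is why the
post hedges with "almost exactly"; the RAM figure is the exact 1024x noted
there.]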