On 10/01/2021 17:50, tom ehlert wrote:
> there is simply no DOS application needing even 100 MB.
> making more than 4 GB available won't change this.
>
> applications needing more than 4 GB would probably benefit more from
> multiple cores.


You may not know it, but I still use DOS to this day in an industrial setting. At our company we have a DOS program that controls hundreds of remote machines in real time, each containing tens of embedded devices.

It's an old program for sure, much modified and amended. It started life as a Windows program but was converted to DOS early on, since it was simply much easier to do all this in DOS. When you are talking to processors running at 2 MHz that must interpret and respond to complex communication protocols without sacrificing the job they really need to do, you end up with protocols that have very precise and tight turnaround times and timeouts. Yes, I could write drivers for every Windows version that comes out (and only a driver could handle this), but Windows (and even Linux) give your drivers access when they want, and sometimes that is not enough.

With DOS on a modern machine I can handle dozens of these communications in real time with nanosecond turnaround times, and the program doesn't even hiccup.

However, to improve speed I also keep a lot of data in memory. One core with unlimited RAM will run circles around multiple cores without enough RAM, and at one client I am already getting close to 2 GB of data that I want instant access to.

Andreas



_______________________________________________
Freedos-devel mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/freedos-devel
