> From: Sean Godsell <sgods...@gmail.com>
> Date: Wed, 27 Nov 2024 17:00:27 -0500
>
> I was wondering if anyone has plans to make the actual 'make' command work
> across multiple connected PC systems, via networking of some kind.  It could
> be wireless networking, ethernet, or even networking over Thunderbolt, USB 4,
> or fiber.  All that matters is that each networked PC has access to the same
> files.
>
> For example, if you want multiple PCs compiling the Linux kernel source code,
> then each PC needs to see the same kernel files and directory structure.  The
> main build server PC that holds all of the kernel source code would also need
> something like an NFS server configured and running, so that each connected
> PC can help with compiling the source code, as long as each PC has access to
> the exact same files via an NFS client, which needs to be set up as well.  To
> speed things up even more, you could make sure all of the build programs are
> installed on each client PC too, like gcc, g++, as, ar, ld, ...
>
> I was thinking of using Open MPI, which would send each build instruction to
> any available open slot via MPI.
>
> Has anyone attempted anything like this before?  If so, where is that
> information for make?  Any help on this subject would be greatly appreciated.
> Thanks in advance.
This is already supported, and has been for a long time.  You just need to implement a few functions; see remote-stub.c.  One such implementation, for the Customs daemon, is already available; see remote-cstms.c.
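
For reference, here is a rough sketch of that hook interface, paraphrased from
the default no-op implementation in remote-stub.c.  Exact parameter lists can
differ slightly between make releases, so treat the remote-stub.c in your own
source tree as authoritative; a distributed backend (Customs via remote-cstms.c,
or an MPI-based one like you describe) replaces these no-ops with real logic:

    /* Sketch of make's remote-job hooks, modeled on remote-stub.c.
       The real file includes make's internal headers; this is only an
       outline of what a backend has to provide.  */

    #include <errno.h>
    #include <sys/types.h>

    /* Text naming the remote system, shown in version output.  */
    char *remote_description = 0;

    /* Called once at startup and at shutdown, e.g. to contact or
       release the export/scheduler host.  */
    void remote_setup (void)   {}
    void remote_cleanup (void) {}

    /* Return nonzero if the next job should be started remotely.
       FIRST_P is nonzero on the first attempt for this job.  */
    int
    start_remote_job_p (int first_p)
    {
      (void) first_p;
      return 0;                 /* The stub never exports jobs.  */
    }

    /* Start ARGV running remotely with environment ENVP and stdin
       taken from STDIN_FD.  On success, store an ID for the child in
       *ID_PTR, set *IS_REMOTE if it really ran remotely, and set
       *USED_STDIN if the remote side consumed STDIN_FD.
       Return 0 on success, nonzero on failure.  */
    int
    start_remote_job (char **argv, char **envp, int stdin_fd,
                      int *is_remote, pid_t *id_ptr, int *used_stdin)
    {
      (void) argv; (void) envp; (void) stdin_fd;
      (void) is_remote; (void) id_ptr; (void) used_stdin;
      return -1;                /* Could not start a remote job.  */
    }

    /* Wait for a remote child to finish, like wait().  Store its exit
       code, terminating signal and core-dump flag; return its ID, or
       -1 with errno set to ECHILD when there are no remote children.  */
    int
    remote_status (int *exit_code_ptr, int *signal_ptr,
                   int *coredump_ptr, int block)
    {
      (void) exit_code_ptr; (void) signal_ptr;
      (void) coredump_ptr; (void) block;
      errno = ECHILD;
      return -1;
    }

    /* Temporarily defer, then re-enable, status reports for remote
       children (used around critical sections in the job engine).  */
    void block_remote_children (void)   {}
    void unblock_remote_children (void) {}

    /* Send signal SIG to the remote child ID.  */
    int
    remote_kill (pid_t id, int sig)
    {
      (void) id; (void) sig;
      return -1;
    }

If you just want to try the existing Customs support rather than writing an MPI
backend, I believe make's configure script has a --with-customs option that
builds remote-cstms.c in place of the stub; see README.customs in the source
distribution for details.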