Mike Furr writes:
> [Please CC, I'm not subscribed]
> Hello gcc team,
>
> One of my packages fails to build on m68k due to excessive memory use by
> g++ (~373 MB) and thus has RC bug 207578. This occurs on all archs, but
> m68k is the only arch without the memory to handle it. I recently
> discovered [0] that the g++ in gcc-snapshot uses only a fraction of the
> memory on x86, and I assume the same is true on m68k. I would very much
> like to close this bug and get the current version in unstable into
> testing.
>
> To do this, I could either wait for gcc-3.4 to enter unstable, or just
> build a binary on m68k with gcc-snapshot. Note that the package has
> almost no hope of actually being run on m68k, since it requires a
> 3D-accelerated video card, so I don't see any harm in using gcc-snapshot
> just on that arch. However, if a new gcc release is in the near future,
> I wouldn't mind waiting a little longer for that.
>
> So, do you have any general idea of the upcoming release schedule? If
> it's not 'soon', would it be okay to go ahead and use gcc-snapshot to
> build the package on that arch? Or do you have any other
> thoughts/suggestions on what I could do?
Using g++ from the gcc-snapshot package on m68k means changing the
exception model from sjlj-based to dwarf2-based exceptions, so if you
depend on another shared C++ library, that library has to be compiled
with gcc-snapshot as well. And most likely you will depend on shared
libraries from gcc-snapshot (libgcc, libstdc++), which won't move to
testing, so your package would stay stuck in unstable.

Even if gcc-3.4 is released, probably in March/April, gcc-3.3 will
remain the default compiler until sarge is released, and it's unclear
whether gcc-3.4 will make it into sarge even as an optional compiler.

If the package doesn't make sense to use on m68k anyway, maybe just ask
debian-release to remove the m68k binaries from testing and mark the
report as important.

Matthias
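[Editorial note on the exception-model point above: a GCC build configured
for sjlj exceptions usually shows `--enable-sjlj-exceptions` in the
configure line that `g++ -v` prints. A minimal sketch for checking a given
compiler, assuming `g++` is on the PATH; on builds where the flag is absent,
the target default (dwarf2 on most Linux targets) applies.]

```shell
#!/bin/sh
# Hedged sketch: report whether this g++ was explicitly configured
# with sjlj exceptions, based on the configure line from `g++ -v`.
if g++ -v 2>&1 | grep -q -- '--enable-sjlj-exceptions'; then
    echo "sjlj exceptions"
else
    echo "target-default exceptions (dwarf2 on most Linux targets)"
fi
```

Two shared C++ libraries built against different exception models cannot
safely propagate exceptions across each other, which is why every C++
dependency would need rebuilding with the same compiler.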