http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59672

--- Comment #1 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
Note that GCC does not even support generating real 16-bit code for x86.  So pretending
GCC's output is 16-bit code is a joke.

Why can't you just write the 16-bit binary support in assembly for the kernel?
