juj wrote:

> it's a really annoying problem that real-world code takes advantage of laxness

In Unity's case, when some line in our millions of lines of code performs a 
bad cast, in my experience it has not been an intentional reliance on x86 
casting laxness, but rather an oversight, or, even more commonly, a memory 
corruption in which a random memory location is interpreted as a function 
pointer.

But because the existing diagnostic flags provided by LLVM are of no help in 
diagnosing this issue, and the Chrome debugger is unable to produce 
informative error messages in these cast situations, we find these problems 
notoriously difficult to resolve: we can't begin to tell whether a signature 
mismatch came from random memory corruption or from a developer mistake.

In our case these issues would be trivial to fix, if:
a) LLVM had a diagnostic flag that narrowed casts down to specifically the 
offending ones that don't work in WebAssembly, and
b) the Chrome debugger could print "we expected this signature, but we got 
this signature instead" when a badly cast function pointer is invoked.

That is, as far as I've seen, in the Unity codebase it has never been a case 
of "lazy code intentionally relying on x86 ABI casting behavior" (the 
majority of devs don't even seem to know what that behavior is).

https://github.com/llvm/llvm-project/pull/153168
_______________________________________________
cfe-commits mailing list
[email protected]
https://lists.llvm.org/cgi-bin/mailman/listinfo/cfe-commits
