directhex opened a new pull request, #231:
URL: https://github.com/apache/tvm-ffi/pull/231

   This is... largely a follow-up to 
https://github.com/flashinfer-ai/flashinfer/pull/1801, fixing a breakage 
introduced by the new dependency.
   
   It is fairly common in niche scenarios - e.g. anywhere involving embedded 
development or cross-compilation - for compiler options to be more than 
decorative. For example, I sped up our (QEMU-based) CI by 80% by using an x64 
build of clang inside a riscv64 build root, so Python would think "I am doing 
riscv64 native builds" while CXX pointed to the x64 clang++ binary. This works 
great, but has a hard dependency on passing values for `--sysroot` and 
`--target`.
   
   Since adding the tvm-ffi dependency, this has been broken in flashinfer, as 
`import tvm_ffi` shells out to the dlpack build script without passing any 
load-bearing compiler flags.
   
   So... add support for those.
   
   Before:
   
   ```
   >>> import tvm_ffi
   Traceback (most recent call last):
     File 
"/tmp/tvm-ffi/python/tvm_ffi/utils/_build_optional_torch_c_dlpack.py", line 
835, in <module>
       main()
     File 
"/tmp/tvm-ffi/python/tvm_ffi/utils/_build_optional_torch_c_dlpack.py", line 
828, in main
       build_ninja(build_dir=str(build_dir))
     File "/tmp/tvm-ffi/python/tvm_ffi/cpp/extension.py", line 353, in 
build_ninja
       raise RuntimeError("\n".join(msg))
   RuntimeError: ninja exited with status 1
   stdout:
   [1/2] /x64-prefix/bin/clang++ -MMD -MF main.o.d -std=c++17 -fPIC -O2 
-DBUILD_WITH_CUDA -D_GLIBCXX_USE_CXX11_ABI=1 
-I/tmp/tvm-ffi/python/tvm_ffi/../../3rdparty/dlpack/include 
-I/usr/include/python3.12 
-I/tmp/flashinfer/.venv/lib/python3.12/site-packages/torch/include 
-I/tmp/flashinfer/.venv/lib/python3.12/site-packages/torch/include/torch/csrc/api/include
 -I/prefix/include -c /tmp/tvm-ffi-torch-c-dlpack-ei5_rkaa/addon.cc -o main.o
   FAILED: [code=1] main.o
   /x64-prefix/bin/clang++ -MMD -MF main.o.d -std=c++17 -fPIC -O2 
-DBUILD_WITH_CUDA -D_GLIBCXX_USE_CXX11_ABI=1 
-I/tmp/tvm-ffi/python/tvm_ffi/../../3rdparty/dlpack/include 
-I/usr/include/python3.12 
-I/tmp/flashinfer/.venv/lib/python3.12/site-packages/torch/include 
-I/tmp/flashinfer/.venv/lib/python3.12/site-packages/torch/include/torch/csrc/api/include
 -I/prefix/include -c /tmp/tvm-ffi-torch-c-dlpack-ei5_rkaa/addon.cc -o main.o
   In file included from /tmp/tvm-ffi-torch-c-dlpack-ei5_rkaa/addon.cc:2:
   In file included from 
/tmp/tvm-ffi/python/tvm_ffi/../../3rdparty/dlpack/include/dlpack/dlpack.h:35:
   In file included from /x64-prefix/lib/clang/21/include/stdint.h:56:
   /usr/include/stdint.h:26:10: fatal error: 'bits/libc-header-start.h' file 
not found
      26 | #include <bits/libc-header-start.h>
         |          ^~~~~~~~~~~~~~~~~~~~~~~~~~
   1 error generated.
   ninja: build stopped: subcommand failed.
   ```
   
   After:
   
   ```
   # export DLPACK_EXTRA_CFLAGS="--target=riscv64-unknown-linux-gnu --sysroot=/"
   # export DLPACK_EXTRA_LDFLAGS="--target=riscv64-linux-gnu --sysroot=/ 
-L/prefix/lib"
   >>> import tvm_ffi
   >>>
   ```
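   
   For reference, a minimal sketch of how a build helper could pick these up. 
The function name and flag lists here are hypothetical, not the actual 
tvm-ffi implementation; only the `DLPACK_EXTRA_CFLAGS` / 
`DLPACK_EXTRA_LDFLAGS` variable names come from the example above:
   
   ```python
   import os
   import shlex
   
   def extra_flags_from_env(var: str) -> list[str]:
       """Split an environment variable into compiler flags, shell-style.
   
       Returns [] when the variable is unset or empty, so builds that
       don't need extra flags see no change in behaviour.
       """
       return shlex.split(os.environ.get(var, ""))
   
   # Hypothetical use when emitting the ninja rules: append the extra flags
   # after the built-in ones, so user-supplied values take precedence.
   cflags = ["-std=c++17", "-fPIC", "-O2"] + extra_flags_from_env("DLPACK_EXTRA_CFLAGS")
   ldflags = ["-shared"] + extra_flags_from_env("DLPACK_EXTRA_LDFLAGS")
   ```
   
   Using `shlex.split` rather than `str.split` keeps quoted arguments (e.g. 
a sysroot path containing spaces) intact.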


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
