After checking `__tvm_module_ctx` in `libtest_wasm32.a`, I noticed there is a mismatch between the target used by `lib.save` in [build_test_lib.py line 35](https://github.com/apache/incubator-tvm/blob/master/rust/runtime/tests/test_wasm32/src/build_test_lib.py#L35) and the one declared in [.cargo/config](https://github.com/apache/incubator-tvm/blob/master/rust/runtime/tests/test_wasm32/.cargo/config) in the official repo. When I switch the save target to `wasm32-wasi`, everything works fine when executing in the wasmtime runtime.
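
For reference, here is a minimal sketch of the kind of change I mean, assuming the build script compiles with an LLVM Wasm target triple and then saves the object file via `lib.save` (the function name, arguments, and output path below are hypothetical; only the target triple is the point):

```python
import os

from tvm import relay


def build_test_graph(net, params, out_dir):
    # Hypothetical sketch, not the actual build_test_lib.py: the idea is that
    # the LLVM target triple used when building/saving the object should match
    # the `target = "wasm32-wasi"` entry in .cargo/config.
    target = "llvm -target=wasm32-wasi --system-lib"
    graph, lib, params = relay.build(net, target, params=params)
    # Save the compiled object; cargo then links it into the wasm32-wasi crate.
    lib.save(os.path.join(out_dir, "graph.o"))
    return graph, params
```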

Regarding my goal, I guess building the test case is only the first step. To successfully deploy a model in the browser, one possible option is to use a runtime like [@wasmer-wasi](https://docs.wasmer.io/integrations/js/reference-api/wasmer-wasi) to bridge JS and Wasm. On the other hand, building a web app with a pipeline similar to the WebGPU app seems promising as well. @leonwanghui, do you have any comments or a preference on these options?
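
To make the @wasmer-wasi idea a bit more concrete, here is a rough sketch of how the browser side might load and run the generated Wasm, based on the documented `@wasmer/wasi` usage (the file name `test.wasm` and the exact bindings setup are assumptions on my side):

```ts
import { WASI } from "@wasmer/wasi";
import { WasmFs } from "@wasmer/wasmfs";

// Assumed file name; in practice this would be the wasm built from the TVM runtime crate.
const wasmUrl = "test.wasm";

async function run() {
  // In-memory filesystem so WASI syscalls (e.g. writes to stdout) work in the browser.
  const wasmFs = new WasmFs();
  const wasi = new WASI({
    args: [],
    env: {},
    bindings: { ...WASI.defaultBindings, fs: wasmFs.fs },
  });

  const module = await WebAssembly.compileStreaming(fetch(wasmUrl));
  const instance = await WebAssembly.instantiate(module, {
    ...wasi.getImports(module),
  });

  // Start the WASI entry point, then read whatever the module wrote to stdout.
  wasi.start(instance);
  console.log(await wasmFs.getStdOut());
}

run();
```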
