Hi everybody,
I have an ONNX model that I want to import into TVM. For the following code
snippet I get this error:
...
import onnx
from tvm import relay

shape_dict = {'input_16': (1, 128, 3)}
onnx_model = onnx.load('./mymodel.onnx')
sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)
Error: Check fa
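(In case it helps with reproducing this: from_onnx expects the shape_dict keys to match the graph's input names, so a minimal check, using only the onnx package and the same model path as above, is to print what the model actually declares:)

```python
import onnx

# List each graph input's name and declared shape so they can be compared
# against the 'input_16': (1, 128, 3) entry in shape_dict.
onnx_model = onnx.load('./mymodel.onnx')
for inp in onnx_model.graph.input:
    dims = [d.dim_value if d.HasField('dim_value') else d.dim_param
            for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)
```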
Thank you so much for your response!
I saw the PR has been merged into the master branch now, so I've cloned the
latest version, but I still have the same issue!
---
Hi everybody,
I used the TVM debugger following the description
[here](https://tvm.apache.org/docs/dev/debugger.html#how-to-use-debugger).
The following line gives me three files: output_tensors.params,
execution_trace.json, and graph_dump.json.
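(For reference, my setup follows the pattern from that page; the sketch below is just a rough outline, where the input name and shape come from my earlier snippet, the target and dump path are placeholders, and on older TVM versions the import is debug_runtime rather than debug_executor:)

```python
import numpy as np
import tvm
from tvm import relay
from tvm.contrib.debugger import debug_executor  # debug_runtime on older TVM versions

# sym and params are the results of relay.frontend.from_onnx from my first post.
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(sym, target="llvm", params=params)

# Creating the executor with a dump_root is what makes it write the dump files.
dev = tvm.cpu(0)
m = debug_executor.create(lib.get_graph_json(), lib.get_lib(), dev,
                          dump_root="/tmp/tvmdbg")
m.set_input('input_16', tvm.nd.array(np.random.rand(1, 128, 3).astype("float32")))
m.run()  # writes output_tensors.params, the execution trace, and the graph dump under dump_root
out = m.get_output(0).numpy()
```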
I get a different output_tensors.params file for ev