[Apache TVM Discuss] [Questions] Frontend.from_onnx causes error: Expected Array[IntImm], but got relay.Var

2021-07-07 Thread Sahooora via Apache TVM Discuss
Hi everybody, I have an ONNX model that I want to import into TVM. For the following code snippet I get the error shown below:

    shape_dict = {'input_16': (1, 128, 3)}
    onnx_model = onnx.load('./mymodel.onnx')
    sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)

Error: Check failed: …
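For context, this error generally means that a shape reached the importer as a relay.Var (a symbolic dimension) rather than a constant integer. Below is a minimal sketch, assuming a 2021-era TVM (~0.8) and reusing the names from the snippet above ('input_16', (1, 128, 3), './mymodel.onnx'); the input-inspection loop and freeze_params=True are illustrative suggestions, not a confirmed fix for this particular model.

```python
# Hedged sketch of the import above with every input dimension pinned.
# 'input_16', (1, 128, 3) and './mymodel.onnx' come from the post; the rest
# assumes a 2021-era TVM (~0.8) where from_onnx accepts freeze_params.
import onnx
from tvm import relay

onnx_model = onnx.load('./mymodel.onnx')

# Inspect the graph inputs: a symbolic dim_param (e.g. 'batch') that is not
# pinned in shape_dict is a common source of relay.Var shapes.
for inp in onnx_model.graph.input:
    dims = [d.dim_param or d.dim_value for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)

# Use the exact input name from the graph and only concrete integers.
shape_dict = {'input_16': (1, 128, 3)}

# freeze_params=True folds the ONNX initializers into the graph so that more
# shapes can be resolved to constants during conversion.
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict, freeze_params=True)
print(mod)
```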

[Apache TVM Discuss] [Questions] Running a model with tvm debugger

2021-05-03 Thread Sahooora via Apache TVM Discuss
Thank you so much for your response! I saw that the PR has now been merged into the master branch, so I cloned the latest version, but I still have the same issue! --- [Visit Topic](https://discuss.tvm.apache.org/t/runnig-a-model-with-tvm-debugger/9869/3) to respond.

[Apache TVM Discuss] [Questions] Running a model with tvm debugger

2021-04-30 Thread Sahooora via Apache TVM Discuss
Hi everybody, I used the TVM debugger based on the description given [here](https://tvm.apache.org/docs/dev/debugger.html#how-to-use-debugger). The following line gives me three files: output_tensors.params, execution_trace.json, and graph_dump.json. I get a different output_tensors.params file for ev…
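For anyone trying to reproduce this setup, here is a minimal sketch of building and running a module under the debug executor, following the linked docs page. The `mod`/`params` variables, the `llvm` target, the `/tmp/tvmdbg` dump folder, and the input name and shape are assumptions (the name is borrowed from the earlier thread above), not details confirmed in this post; on older TVM checkouts the module is `tvm.contrib.debugger.debug_runtime` rather than `debug_executor`.

```python
# Hedged sketch of running a compiled model under the TVM debug executor.
# Assumes `mod` and `params` were already imported into Relay, a CPU target,
# and an input named 'input_16' of shape (1, 128, 3).
import numpy as np
import tvm
from tvm import relay
from tvm.contrib.debugger import debug_executor  # 'debug_runtime' on older TVM

target = "llvm"
dev = tvm.cpu(0)

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# dump_root is the folder that receives the three dump files mentioned above
# (output_tensors.params, execution_trace.json, graph_dump.json) after run();
# recent versions place them in a per-device subfolder.
m = debug_executor.create(lib.get_graph_json(), lib.get_lib(), dev,
                          dump_root="/tmp/tvmdbg")
m.set_input("input_16", np.random.rand(1, 128, 3).astype("float32"))
m.set_input(**lib.get_params())
m.run()
out = m.get_output(0).numpy()
```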