I'm not sure about the executor class.

However, this approach might meet your requirement as well. (I've never run it, but I expect it should work.)

```python
import tvm
from tvm import relay
from tvm.contrib import graph_executor

# Convert the ONNX model to a Relay module.
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# Compile for CPU.
target = "llvm"
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

dev = tvm.cpu(0)
m = graph_executor.GraphModule(lib["default"](dev))

# Set input (you can set as many inputs as you want, but you need to
# know each input node's name). Here `x` is a NumPy array matching the
# input shape and `dtype` is its element type.
m.set_input("data", tvm.nd.array(x.astype(dtype)))

# Execute.
m.run()

# Get outputs.
tvm_output = m.get_output(0)
```
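Since the original question is about multiple input nodes, here is a minimal sketch of how the multi-input case might look. The input names `input_a` and `input_b` and their shapes are hypothetical; you would need to check your own model's graph inputs (e.g. `[inp.name for inp in onnx_model.graph.input]`) for the real names.

```python
import numpy as np

# Hypothetical input names and shapes -- substitute the ones from your
# own ONNX model. This same dict can be passed as `shape_dict` to
# relay.frontend.from_onnx.
shape_dict = {
    "input_a": (1, 3, 224, 224),
    "input_b": (1, 10),
}

# One random array per input node, for testing.
inputs = {name: np.random.uniform(size=shape).astype("float32")
          for name, shape in shape_dict.items()}

# With a GraphModule `m` built as above, each input is then set by name:
# for name, arr in inputs.items():
#     m.set_input(name, tvm.nd.array(arr))
# m.run()
```

The key point is just that `set_input` is called once per input node, keyed by that node's name.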

---
[Visit Topic](https://discuss.tvm.apache.org/t/is-there-a-demo-to-run-onnx-model-with-multiple-input-nodes/10366/4) to respond.