My impression is actually the opposite of yours: if you return Undef or an incompatible layout, then a layout_transform will be inserted to guarantee correctness. For example:
```
conv2d(NCHW, OIHW) -(NCHW)-> transpose(axis=[0, 2, 3, 1]) -(NHWC)->
```
The output layout of `conv2d`
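For illustration, here is a minimal sketch of that behaviour on a graph shaped like the diagram above (this assumes a recent TVM build; the shapes and the NHWC/HWIO target layouts are arbitrary choices for the example, not taken from the post):

```
import tvm
from tvm import relay

# conv2d in NCHW followed by a transpose to NHWC, as in the diagram.
data = relay.var("data", shape=(1, 3, 32, 32), dtype="float32")
weight = relay.var("weight", shape=(8, 3, 3, 3), dtype="float32")
conv = relay.nn.conv2d(data, weight, kernel_size=(3, 3),
                       data_layout="NCHW", kernel_layout="OIHW")
out = relay.transpose(conv, axes=[0, 2, 3, 1])
mod = tvm.IRModule.from_expr(relay.Function([data, weight], out))

# Ask ConvertLayout for an NHWC conv2d. Wherever the inferred layouts do not
# line up (e.g. at the NCHW input vars, or at an op whose InferCorrectLayout
# result is undefined), the pass inserts layout_transform to stay correct.
seq = tvm.transform.Sequential([
    relay.transform.InferType(),
    relay.transform.ConvertLayout({"nn.conv2d": ["NHWC", "HWIO"]}),
])
with tvm.transform.PassContext(opt_level=3):
    mod = seq(mod)
print(mod)
```

Printing the module after the pass shows the inserted `layout_transform` ops.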
---
@aakah18151 sorry for the delay. Hm, it seems to me like you might somehow be allocating too much memory for your device in the runtime; you could be overwriting the stack when you call `set_input`. Unfortunately we don't have a good way to detect this at the moment, though Zephyr should provide you
---
In TVM, a tuple corresponds to `tvm::runtime::Array`, so I am not sure whether it is well supported in codegen.
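For what it's worth, at the pure runtime level an `Array` can be passed to a PackedFunc; here is a minimal sketch (the function name `my_ext.print_array` is made up for the example, and whether the same works inside `tir.call_packed`-generated code is the separate question raised in this thread):

```
import tvm

# Register a packed func that receives a tvm::runtime::Array.
@tvm.register_func("my_ext.print_array")
def _print_array(arr):
    # Elements arrive as IntImm nodes; read out their values.
    print([x.value for x in arr])

f = tvm.get_global_func("my_ext.print_array")
# A Python list converts to an Array on the way into the PackedFunc.
f(tvm.runtime.convert([1, 2, 3]))
```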
---
Hey @areusch, it seems like I am unable to set the lowered model parameters into `graph_mod`, which is why I am getting this error. I am able to run this sine model (https://tvm.apache.org/docs/tutorials/micro/micro_tflite.html) using similar steps, but when passing the lowered parameters from my mo
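For reference, a minimal host-side sketch of the build-and-load flow that tutorial follows (assuming a recent TVM; the toy dense model, the parameter name `w`, and the `llvm` target are placeholders rather than the poster's model, and the microTVM flow would create the executor through a `tvm.micro` session instead):

```
import numpy as np
import tvm
from tvm import relay
from tvm.contrib import graph_executor

# Stand-in model: y = dense(x, w), with `w` supplied as a named parameter.
x = relay.var("x", shape=(1, 4), dtype="float32")
w = relay.var("w", shape=(4, 4), dtype="float32")
mod = tvm.IRModule.from_expr(relay.Function([x, w], relay.nn.dense(x, w)))
params = {"w": np.ones((4, 4), dtype="float32")}

with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)

dev = tvm.cpu(0)
graph_mod = graph_executor.GraphModule(lib["default"](dev))
# The "default" factory call above already loads the params bound at build
# time; if the params are held separately (as in the micro tutorial), they
# would be set explicitly, e.g. graph_mod.set_input(**lib.get_params()).
graph_mod.set_input("x", np.ones((1, 4), dtype="float32"))
graph_mod.run()
print(graph_mod.get_output(0).numpy())
```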
---
@eric-haibin-lin @ziheng any thoughts? I saw you were involved in [PR6079](https://github.com/apache/tvm/pull/6079), which seems to contain the last modifications to `te.extern`
---
Hello everyone,
I was wondering if there is a way to pass a tuple (or array) directly to a
packed func.
```
import tvm

tpl = (1, 2, 3)
tvm.tir.call_packed("func_name", tpl)
```
Reading the [packed_func
documentation](https://tvm.apache.org/docs/dev/runtime.html?highlight=packed_func#packedfunc),
I
---
Hello everyone,
I have been implementing my version of the
[Resampler](https://www.tensorflow.org/addons/api_docs/python/tfa/image/resampler)
op (from the TF frontend) in our TVM stack.
Now, to my understanding, by adding the "InferCorrectLayout" attribute to the Relay Call node I should be able