Does anyone know it? :sob:
---
[Visit Topic](https://discuss.tvm.apache.org/t/storage-alignment-info-lost/7289/2) to respond.
I am getting poor performance (in terms of schedule efficiency) from the
auto-scheduler for a simple consecutive-subtraction kernel:
```
# 'in' is a Python keyword, so the placeholder is renamed to 'inp'
inp = te.placeholder((N, H, W), dtype='float32')
out = te.compute((N - 1, H, W), lambda n, y, x: inp[n + 1, y, x] - inp[n, y, x])
return [inp, out]
```
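As a sanity check on what this kernel computes, here is a NumPy reference: the output subtracts consecutive slices along the first axis, so an (N, H, W) input yields an (N - 1, H, W) output. The array name and shapes below are illustrative, not from the original post.

```python
import numpy as np

def consecutive_subtract(a):
    # Reference semantics: out[n, y, x] = a[n + 1, y, x] - a[n, y, x]
    return a[1:] - a[:-1]

a = np.arange(8, dtype="float32").reshape(2, 2, 2)
print(consecutive_subtract(a).shape)  # (1, 2, 2)
```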
Example (tir) of sche
Hi there,
I just compiled from source and the output is the same now:
```
loop = 0
Pytorch     : 0.048932
OnnxRuntime : 0.048932
TVM         : 0.048932
loop = 1
Pytorch     : 0.079360
OnnxRuntime : 0.079360
TVM         : 0.079360
```
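When three different backends report byte-identical timings, it is worth double-checking that each backend's measurement is taken and stored in its own variable rather than the same one printed three times. A minimal per-backend timing helper (a hypothetical sketch, not the poster's script) might look like:

```python
import time

def bench(fn, repeat=10):
    # Run fn `repeat` times and keep the best wall-clock time,
    # so each backend gets its own independent measurement.
    times = []
    for _ in range(repeat):
        start = time.perf_counter()
        fn()
        times.append(time.perf_counter() - start)
    return min(times)

t = bench(lambda: sum(range(10_000)))
print(f"{t:.6f}")  # best-of-10 time in seconds
```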
---
Hi @Wanger-SJTU, right now profiling is not supported over RPC. There are a
couple of issues blocking support, but the main one is that the profiling
report cannot be sent over RPC.
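Until the report type itself is serializable over RPC, one generic workaround is to flatten the report into a string on the remote side, since plain strings cross the transport fine. A language-agnostic sketch of that pattern (the dict here is a stand-in, not TVM's actual `Report` type):

```python
import json

# Stand-in for a structured profiling report; TVM's real Report object
# is not used here because it cannot cross the RPC boundary.
report = {"op": "fused_subtract", "duration_us": 123.4, "count": 10}

wire = json.dumps(report)      # serialize on the remote before sending
received = json.loads(wire)    # reconstruct on the host
print(received["op"])          # prints "fused_subtract"
```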
---
[Visit Topic](https://discuss.tvm.apache.org/t/type-array-is-not-supported-by-rpc/10859/2) to respond.
I use the tvm-rpc APK for the RPC connection, and compiled TVM and the runtime with
```
set(USE_RPC ON)
set(USE_RELAY_DEBUG ON)
set(USE_GRAPH_EXECUTOR_DEBUG ON)
```
while the APK was compiled with the default settings, so running in debug mode
failed with no `tvm.graph_executor_debug.create` found.
Later I adde
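The `tvm.graph_executor_debug.create` function is only registered when the runtime linked into the APK is itself built with the debug executor enabled. Assuming the standard TVM Android app layout, the app's own config.cmake likely needs the same flags before rebuilding the APK (a config sketch under that assumption, not verified against every app version):

```
set(USE_RPC ON)
set(USE_GRAPH_EXECUTOR_DEBUG ON)
```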