Unfortunately, this cannot be controlled by the auto-scheduler. One good feature 
of the auto-scheduler is that it schedules an entire fused compute graph together. 
On the other hand, this implies that we no longer differentiate ops once they are 
fused into a single TE compute. Specifically, the names of the TE tensors, such 
as `PadInputs`, `T_reshape`, and `compute`, are defined in TOPI. 

References:

https://github.com/apache/tvm/blob/813136401a11a49d6c15e6013c34dd822a5c4ff6/python/tvm/topi/nn/pad.py#L26

https://github.com/apache/tvm/blob/5ad2f77403bed9a2bf356cc0d3d785ecc13e6c58/include/tvm/topi/transform.h#L235

However, for your particular case, `T_reshape` seems to come from the same op:

```
T_reshape_ax0, T_reshape_ax1, T_reshape_ax2, T_reshape_ax3 = \
  tuple(T_reshape.op.axis) + tuple(T_reshape.op.reduce_axis)

T_reshape_ax0, T_reshape_ax1, T_reshape_ax2, T_reshape_ax3 = \
  tuple(T_reshape.op.axis) + tuple(T_reshape.op.reduce_axis)

T_reshape_ax0, T_reshape_ax1, T_reshape_ax2, T_reshape_ax3 = \
  tuple(T_reshape.op.axis) + tuple(T_reshape.op.reduce_axis)
```

So IIUC, this is just the auto-scheduler printer making sure that the variable 
(e.g., `T_reshape_ax0`) it refers to is the right one, so it re-generates the 
axis variables from the same reshape op each time.




