Thank you for this proposal! This work does make scheduling much easier. I have 
a concern about using this style to write a tensor expression: it looks more 
complicated than tvm.compute when defining matmul, since we need to declare 
buffers and create a block with the corresponding shape dimensions. It would be 
helpful if you could add a conv2d example that can replace the existing 
topi.nn.conv2d definition, to better understand what a developer would need to 
write.

Another question is about representing generic-programming-style ops such as 
shape functions. Since these programs don't fit into TVM scheduling, I assume 
it would still be more convenient to use the existing te hybrid script to 
create these ops?

---
[Visit Topic](https://discuss.tvm.apache.org/t/rfc-tensorir-a-schedulable-ir-for-tvm/7872/9) to respond.