I was [looking for something like this a couple of months 
back](https://discuss.tvm.apache.org/t/reshape-in-place-using-relay/6856), but 
to no avail.

It would be useful to have; I'm just unsure what changes would be needed. In a 
sense we already have in-place operations when we fuse conv2d+relu layers 
(afaik), since we apply the ReLU to the accumulated value as soon as it is 
ready.
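To make that concrete, here is a minimal sketch in plain Python (not actual TVM code, and using a 1D convolution for brevity) of the difference between materialising the conv output and then applying ReLU, versus the fused style where ReLU is applied to each accumulated value the moment its reduction finishes:

```python
def conv1d_then_relu(data, kernel):
    # Unfused: the convolution writes a full intermediate buffer,
    # then ReLU makes a second pass over it.
    k = len(kernel)
    conv = [sum(data[i + j] * kernel[j] for j in range(k))
            for i in range(len(data) - k + 1)]
    return [max(v, 0) for v in conv]

def conv1d_relu_fused(data, kernel):
    # Fused / "in-place": ReLU is applied to the accumulator as soon
    # as it is ready, so no intermediate conv buffer is materialised.
    k = len(kernel)
    out = []
    for i in range(len(data) - k + 1):
        acc = 0
        for j in range(k):
            acc += data[i + j] * kernel[j]
        out.append(max(acc, 0))  # applied when the value is ready
    return out
```

Both produce the same result; the fused version just never stores the pre-ReLU values, which is the kind of buffer reuse the fusion pass exploits.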

Doing this requires a specialised pass (though I haven't read the code for it). 
One could in principle do something similar for your use case. But it's more 
interesting to consider what a general solution would look like, one that could 
be used easily at the Python `te.compute` expression level.





---
[Visit 
Topic](https://discuss.tvm.apache.org/t/supporting-in-place-operations/7871/2) 
to respond.
