What Relay expands this to is a memory copy, which I want to avoid: I'd like a 
copy-free representation in TIR.

The following should really be a no-op, but ends up copying everything.

```python
import tensorflow as tf
import tvm
import tvm.relay

g = tf.Graph()
with g.as_default():
  u = tf.unstack(tf.placeholder(dtype='int32', shape=(100, 100)))
  s = tf.stack(u)

  m, p = tvm.relay.frontend.tensorflow.from_tensorflow(g.as_graph_def(add_shapes=True))
  g, m, p = tvm.relay.build_module.build(m, target='llvm')
  m.save('pack.ll')
```
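For reference, the unstack/stack round trip is an identity at the array level, which is why one would expect a no-op here. A minimal NumPy sketch of the same round trip (NumPy stands in for the TF ops purely for illustration):

```python
import numpy as np

a = np.arange(6, dtype='int32').reshape(2, 3)

# split along axis 0 into a list of row slices (analogous to tf.unstack)
slices = list(a)

# stack the slices back along a new leading axis (analogous to tf.stack)
b = np.stack(slices)

# the round trip reproduces the original array exactly
assert (a == b).all()
```

Since the result is bit-for-bit identical to the input, a sufficiently aware lowering could in principle alias the output buffer to the input instead of materializing each slice.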


---
[Visit Topic](https://discuss.tvm.ai/t/tensor-arrays-in-tir/7135/4) to respond.
