If we can get [`Any`](https://github.com/dmlc/tvm/issues/3042) merged, I think we
can support TensorArray as follows:
````
type dynamic_tensor =
    Tensor0 of TensorType(shape=())
  | Tensor1 of TensorType(shape=(Any))
  | Tensor2 of TensorType(shape=(Any, Any))
  | Tensor3 of TensorType(shape=(Any, Any, Any))
  | Tensor4 of TensorType(shape=(Any, Any, Any, Any))
  | Tensor5 of TensorType(shape=(Any, Any, Any, Any, Any))
  | Tensor6 of TensorType(shape=(Any, Any, Any, Any, Any, Any))

type tensor_array = dynamic_tensor list
````

We define a data type `dynamic_tensor` that supports tensors of rank up to 6 (we
can grow the maximum rank, of course, but that is probably not necessary). A
tensor array is then just a list of `dynamic_tensor`.
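To make the idea concrete, here is a minimal Python model of the ADT above (illustrative only, not TVM API; the class and field names are my own): each value carries a rank tag mirroring the `Tensor0`..`Tensor6` constructors, so a consumer can "pattern match" on rank before touching the data.

```python
# Hypothetical stand-in for the dynamic_tensor ADT: the rank tag plays
# the role of the constructor (Tensor0 .. Tensor6).
class DynamicTensor:
    MAX_RANK = 6  # matches the largest constructor in the ADT sketch

    def __init__(self, data, rank):
        assert 0 <= rank <= self.MAX_RANK
        self.data = data
        self.rank = rank

# A tensor array is then just a Python list of DynamicTensor values;
# elements may carry different ranks, exactly as the ADT allows.
ta = [DynamicTensor(1.0, rank=0), DynamicTensor([1.0, 2.0], rank=1)]
print([t.rank for t in ta])  # [0, 1]
```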

Then we can implement the TensorArray ops as Relay functions. Most of them are
trivial to implement. Some are tricky (but I think doable with `expand_dims`):
* TensorArrayConcat
* TensorArrayStack
* TensorArrayUnstack
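As a sketch of why `expand_dims` suffices for the tricky ops: TensorArrayStack can be expressed as expanding each element's rank by one and then concatenating along the new leading axis. A toy Python model (plain nested lists standing in for tensors; the helper names are mine, not Relay's):

```python
# expand_dims(t, axis=0): wrap the tensor in a new leading axis,
# turning a rank-N value into a rank-(N+1) value.
def expand_dims0(t):
    return [t]

# concat along axis 0 is list concatenation on the outermost level.
def concat0(tensors):
    out = []
    for t in tensors:
        out.extend(t)
    return out

# TensorArrayStack = expand each element, then concat on the new axis.
def tensor_array_stack(ta):
    return concat0([expand_dims0(t) for t in ta])

ta = [[1, 2, 3], [4, 5, 6]]    # a tensor array of two rank-1 tensors
print(tensor_array_stack(ta))  # [[1, 2, 3], [4, 5, 6]]
```

TensorArrayConcat skips the `expand_dims` step (concat only), and TensorArrayUnstack is the inverse: slice along axis 0 and squeeze each slice back down one rank.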

---
[Visit Topic](https://discuss.tvm.ai/t/how-to-support-tf-tensorarray/1983/3) to 
respond.
