Could someone guide me on how to implement a Relay equivalent of the Keras 
[`TimeDistributed` 
layer](https://keras.io/api/layers/recurrent_layers/time_distributed/)? I need 
to import a Keras model that uses it, and I am willing to extend the Keras 
importer to support it.

Would it be correct to say that the TVM equivalent would split the tensor 
along its second (time) axis, apply the wrapped layer to each resulting slice, 
and then concatenate the results back together? If so, which Relay operator 
should I use for the splitting? I don't have much experience with Keras and, 
as I said, I would be very glad to PR support for `TimeDistributed` to the 
Keras importer.
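To make the question concrete, here is the behaviour I have in mind sketched in plain NumPy (the `time_distributed` helper and the dense stand-in layer are hypothetical illustrations, not Relay code); my guess is that the Relay lowering would mirror this split/apply/concatenate structure with ops such as `relay.split` and `relay.concatenate`:

```python
import numpy as np

def time_distributed(x, layer_fn):
    # Apply layer_fn independently to each time step (axis 1) and
    # stack the per-step results back along that axis -- the
    # semantics a Relay lowering would need to reproduce.
    steps = [layer_fn(x[:, t]) for t in range(x.shape[1])]
    return np.stack(steps, axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 8))        # (batch, time, features)
w = rng.standard_normal((8, 16))          # stand-in dense weight
y = time_distributed(x, lambda s: s @ w)  # dense applied per time step
print(y.shape)                            # (2, 4, 16)
```

If the wrapped layer only touches the trailing feature axes, I suppose an alternative lowering could merge the batch and time axes with a reshape, apply the layer once, and reshape back, which would avoid the split/concatenate entirely; whether that holds for every layer `TimeDistributed` can wrap is part of what I am asking.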

---
[Visit 
Topic](https://discuss.tvm.apache.org/t/what-relay-op-would-allow-for-splitting-a-tensor-along-an-axis/8478/1)
 to respond.
