Does Relay have an FFT operator?
---
[Visit Topic](https://discuss.tvm.ai/t/fft-operator-in-relay-ir/7040/1) to
respond.
Hi, graph tuning is a graph-level (Relay) optimization. Usually you need to create a Relay op that wraps your TE computation in order to use graph tuning.
---
[Visit
Topic](https://discuss.tvm.ai/t/graph-tuning-a-te-computation-te-op-to-relay-expr/7037/2)
to respond.
Is it possible to graph tune a TE-based computation?
Graph tuners expect a relay.Expr, but so far I could not find a way to convert TE-based expressions to Relay expressions.
Whenever I use TE-based constructs (tensors and ops), I find them to be incompatible with Relay constructs. Is it possible to convert between them?
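The distinction is easy to see side by side. Below is a minimal sketch contrasting the two layers; the shapes, tensor names, and the multiply-by-two computation are illustrative assumptions, not anything from the thread.

```python
import tvm
from tvm import te, relay

# TE level: concrete tensors plus a compute rule; this is what kernel
# (task-level) tuning works on, and it yields a te.Schedule, not a relay.Expr.
n = 1024
A = te.placeholder((n,), name="A", dtype="float32")
B = te.compute((n,), lambda i: A[i] * 2.0, name="B")
s = te.create_schedule(B.op)

# Relay level: a dataflow graph of registered ops; this is what the graph
# tuner consumes.
x = relay.var("x", shape=(n,), dtype="float32")
y = relay.multiply(x, relay.const(2.0, "float32"))
mod = tvm.IRModule.from_expr(relay.Function([x], y))

# The graph tuner walks `mod` and looks up schedules per registered Relay op,
# so a bare te.compute stays invisible to it until it is exposed behind a
# Relay op, as the reply above suggests.
```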
have solved the issue?
---
[Visit
Topic](https://discuss.tvm.ai/t/load-a-tvm-model-from-a-float16-onnx-model-error/6855/2)
to respond.
Yeah, it all wants to be static to operate on.
But what I'm after is the next step: eliminating all ops that are not needed in a static setting.
This seems important for anything where the graph is created automatically, with the frontend converters as well as with differentiation.
Best regards
Thomas
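For the static case, one hedged starting point is Relay's own pass pipeline. The sketch below assumes the ops to eliminate are dynamic-shape ops that a frontend emits even when the input shape is fixed, and it leans on the DynamicToStatic and FoldConstant passes; it illustrates that idea rather than the full clean-up being asked for here.

```python
import tvm
from tvm import relay

# A converter-produced graph often computes shapes at runtime (shape_of feeding
# a dynamic reshape) even though the input shape is already known statically.
x = relay.var("x", shape=(1, 4), dtype="float32")
y = relay.reshape(x, relay.shape_of(x))  # newshape is an Expr -> dynamic reshape
mod = tvm.IRModule.from_expr(relay.Function([x], y))

# With shapes fixed, constant folding can evaluate shape_of, and DynamicToStatic
# can then rewrite the dynamic op to its static counterpart.
seq = tvm.transform.Sequential(
    [
        relay.transform.InferType(),
        relay.transform.FoldConstant(),
        relay.transform.DynamicToStatic(),
        relay.transform.FoldConstant(),
    ]
)
with tvm.transform.PassContext(opt_level=3):
    mod = seq(mod)

print(mod)  # the shape computation should be folded into a static reshape
```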