I could not find a way to use `te.Tensor`s with relay ops or expressions. What, then, is the intended way to wrap `te` constructs in a relay op?
---
[Visit Topic](https://discuss.tvm.ai/t/graph-tuning-a-te-computation-te-op-to-relay-expr/7037/4) to respond.
Thanks for the answer Kevin. What's the best way of doing this?
---
[Visit Topic](https://discuss.tvm.ai/t/graph-tuning-a-te-computation-te-op-to-relay-expr/7037/3) to respond.
Is it possible to graph-tune a te-based computation? Graph tuners expect a `relay.Expr`, but so far I could not find a way to convert te-based expressions to relay expressions.
Whenever I use te-based constructs (tensors and ops), I find them to be incompatible with relay constructs; is it possible to use them together?
---
@zhiics done! It's [here](https://github.com/apache/incubator-tvm/pull/5259).
---
[Visit Topic](https://discuss.tvm.ai/t/custom-pass-is-not-working-from-tutorial/5549/7) to respond.
Works!
I also needed to set the multiplier constant's dtype explicitly to `float32`, since a plain Python `float` was auto-inferred as `float64` on my system, which caused a type error:

```python
custom_pass = CustomPipeline(multiplier=relay.const(3, "float32"))
```
---
I am experiencing the same problem. The tutorial on [implementing a pass using python decorators](https://docs.tvm.ai/tutorials/dev/relay_pass_infra.html#implement-a-pass-using-python-decorator) does not seem to work: while `transform_function` is being called, the `visit_const` it defines is not.