[TVM Discuss] [Development/RFC] Performing Relay Passes Non-Recursively

2020-03-22 Thread Marisa Kirisame via TVM Discuss
Hmm, after a bit of thinking: doesn't this pass force a visit to all children, even those that don't necessarily need it? Would making the right-hand side a lazy value make more sense? --- [Visit Topic](https://discuss.tvm.ai/t/performing-relay-passes-non-recursively/5696/18) to respond.
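For illustration (not from the original thread), a minimal plain-Python sketch of the "lazy right-hand side" idea, assuming a hypothetical `visit`/`rewrite` pass structure: each child's result is stored as a memoized thunk, so a rewrite rule that never forces a child never visits it. Stack safety is a separate concern (see the trampoline in the next message).

```python
from collections import namedtuple

# Minimal illustration (plain Python, not TVM code): store each child's
# result as a memoized thunk ("lazy value"), so the pass only visits the
# children a rewrite rule actually forces.

Node = namedtuple("Node", ["op", "children"])

class Lazy:
    """Memoized thunk: compute() runs at most once, on first force()."""
    def __init__(self, compute):
        self._compute, self._cached, self._forced = compute, None, False

    def force(self):
        if not self._forced:
            self._cached, self._forced = self._compute(), True
        return self._cached

def visit(node, rewrite, memo=None):
    # Note: forcing still recurses through the Python stack here; making it
    # stack-safe is what the hand-written trampoline below is about.
    memo = {} if memo is None else memo
    if node not in memo:
        lazy = [Lazy(lambda c=c: visit(c, rewrite, memo)) for c in node.children]
        memo[node] = rewrite(node, lazy)
    return memo[node]

def rewrite(node, lazy_children):
    # Example rule: `add(x, zero)` keeps only the left operand, so the
    # right subtree is never visited at all.
    if node.op == "add" and node.children[1].op == "zero":
        return lazy_children[0].force()
    return Node(node.op, tuple(c.force() for c in lazy_children))

tree = Node("add", (Node("x", ()), Node("zero", ())))
print(visit(tree, rewrite))   # Node(op='x', children=()); the zero child was never visited
```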

[TVM Discuss] [Development/RFC] Performing Relay Passes Non-Recursively

2020-03-22 Thread Marisa Kirisame via TVM Discuss
Writing a trampoline by hand is probably the way to go for those passes. --- [Visit Topic](https://discuss.tvm.ai/t/performing-relay-passes-non-recursively/5696/17) to respond.
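As an illustration (not from the original thread), a minimal plain-Python sketch of a hand-written trampoline: each step returns either a final value or a thunk describing the next step, and a flat driver loop keeps running thunks, so the host stack does not grow with expression depth.

```python
# Minimal sketch of a hand-written trampoline (plain Python, not TVM code).

class Done:
    def __init__(self, value):
        self.value = value

class Call:
    def __init__(self, thunk):
        self.thunk = thunk          # zero-argument callable producing Done or Call

def trampoline(step):
    # Flat driver loop: keeps running thunks until a final value appears.
    while isinstance(step, Call):
        step = step.thunk()
    return step.value

# Example: measure the nesting depth of a deeply nested list without the
# call stack growing proportionally to the nesting depth.
def depth(node, acc=0):
    if not isinstance(node, list) or not node:
        return Done(acc)
    return Call(lambda: depth(node[0], acc + 1))

deep = []
for _ in range(100_000):
    deep = [deep]
print(trampoline(depth(deep)))      # 100000, no RecursionError
```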

[TVM Discuss] [Development] [Discussion] Adding a function to Relay module automatically triggers InferType

2019-08-10 Thread Marisa Kirisame via TVM Discuss
@junrushao1994 There is no plan to make Relay dynamically typed. However, you can still get the same effect with roughly:

    data Any
      = AnyIntScalar (Tensor[(), Int])
      | AnyFunction  (Any -> Any)
      | AnyTuple     (Any, Any)
      ...
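For illustration only, a plain-Python rendering of the tagged-union encoding above; the names mirror the sketch, but none of this is Relay API.

```python
# Plain-Python illustration (not Relay code) of the tagged-union encoding:
# one wrapper type with a constructor per shape of value, and explicit tag
# checks wherever a specific shape is needed.

from dataclasses import dataclass
from typing import Callable, Tuple

class AnyValue:                        # corresponds to `Any` in the sketch
    pass

@dataclass
class AnyIntScalar(AnyValue):
    value: int                         # stands in for Tensor[(), Int]

@dataclass
class AnyFunction(AnyValue):
    fn: Callable[[AnyValue], AnyValue]

@dataclass
class AnyTuple(AnyValue):
    fields: Tuple[AnyValue, AnyValue]

def apply(f: AnyValue, x: AnyValue) -> AnyValue:
    # A "dynamic" call site: check the tag before using the payload.
    assert isinstance(f, AnyFunction), "expected a function value"
    return f.fn(x)

inc = AnyFunction(lambda a: AnyIntScalar(a.value + 1))
print(apply(inc, AnyIntScalar(41)).value)   # 42
```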

[TVM Discuss] [Development] [Discussion] Adding a function to Relay module automatically triggers InferType

2019-08-10 Thread Marisa Kirisame via TVM Discuss
@junrushao1994 There are PRs to add mutual recursion to Relay. In general, the only reason a Relay program can't typecheck is that it is wrong, so we want to catch this as early as possible. For cases 1/2, it is still possible to call the type inference function yourself. What is your use case?
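A hedged sketch of "calling the type inference function yourself", assuming a recent TVM build's Python API (the exact entry points have moved around since 2019):

```python
# Run type inference explicitly on a module rather than relying on it
# being triggered implicitly when a function is added.

import tvm
from tvm import relay

x = relay.var("x", shape=(2, 2), dtype="float32")
fn = relay.Function([x], relay.add(x, x))

mod = tvm.IRModule.from_expr(fn)
mod = relay.transform.InferType()(mod)      # explicit type inference pass
print(mod["main"].checked_type)             # the inferred FuncType of main
```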

[TVM Discuss] [Development] [Relay][concatenate]Downcast from relay.RefType to relay.TensorType failed."

2019-08-07 Thread Marisa Kirisame via TVM Discuss
@Ruinhuang See https://github.com/dmlc/tvm/pull/3729. --- [Visit Topic](https://discuss.tvm.ai/t/relay-concatenate-downcast-from-relay-reftype-to-relay-tensortype-failed/3595/7) to respond.

[TVM Discuss] [Development] [Relay][concatenate]Downcast from relay.RefType to relay.TensorType failed."

2019-08-07 Thread Marisa Kirisame via TVM Discuss
@Ruinhuang I fixed it in add_grad and will upstream it right now. --- [Visit Topic](https://discuss.tvm.ai/t/relay-concatenate-downcast-from-relay-reftype-to-relay-tensortype-failed/3595/6) to respond.

[TVM Discuss] [RFC] [RFC] Implement "add_to" semantic in TVM

2019-08-05 Thread Marisa Kirisame via TVM Discuss
> I’m wondering whether it has any problem such as what @junrushao1994 mentioned.

If there are unknown aliases or unknown add_to calls in other places of the code, it cannot be modeled as option 1. Let's hope that doesn't happen.

[TVM Discuss] [RFC] [RFC] Implement "add_to" semantic in TVM

2019-08-05 Thread Marisa Kirisame via TVM Discuss
If I understand correctly, `add_to(a, b)` increments a with b; during this process, the value of a will be changed. The second approach is, imho, a RED FLAG idea that I don't think we should do. If add_to is implemented as above, it will greatly complicate Operator Fusion, Gradient, Partial Evaluation, ...
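For illustration (not TVM code), a small NumPy sketch contrasting the mutating reading of `add_to` with the pure rewrite; the hidden write through aliases is exactly the kind of effect that complicates pure-IR passes:

```python
import numpy as np

# Plain-Python/NumPy illustration (not TVM code) of the two readings:
# the mutating add_to changes `a` in place, so every alias of `a` observes
# the write; the pure form produces a fresh value and leaves `a` alone,
# which is the property passes like fusion and gradient rely on.

def add_to_inplace(a, b):
    a += b           # writes into the buffer behind `a`
    return a

def add_pure(a, b):
    return a + b     # fresh result, `a` unchanged

a = np.zeros(3)
alias = a                        # another name for the same buffer
add_to_inplace(a, np.ones(3))
print(alias)                     # [1. 1. 1.]  the alias saw the hidden write

a = np.zeros(3)
alias = a
c = add_pure(a, np.ones(3))
print(alias)                     # [0. 0. 0.]  no hidden effect through aliases
```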

[TVM Discuss] [Development] [Relay] Higher order AD broken in some cases

2019-06-25 Thread Marisa Kirisame via TVM Discuss
I know how to fix it, and I will in a few days. --- [Visit Topic](https://discuss.tvm.ai/t/relay-higher-order-ad-broken-in-some-cases/3036/3) to respond.