[TVM Discuss] [Development/RFC] Dynamic Ops in Relay

2020-06-16 Thread Matthew Brookhart via TVM Discuss
https://github.com/apache/incubator-tvm/pull/5826

2020-06-13 Thread tqchen via TVM Discuss
Indeed, A1 can address the problem better: in the proposal there is a dyn-to-static pass, which will try to convert constant inputs to attributes as much as possible. After this pass, all constant-shape reshapes become static, and then we can apply CSE easily. Of course, we can also
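A toy sketch of that idea (illustrative only; the class and pass names here are hypothetical, not the actual Relay implementation):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class Reshape:
    """Toy reshape op: the target shape is either a runtime input
    (shape_input, the dynamic form) or an attribute (shape_attr)."""
    data: str
    shape_input: Optional[Tuple[int, ...]] = None
    shape_attr: Optional[Tuple[int, ...]] = None

def dyn_to_static(op: Reshape) -> Reshape:
    """If the shape input turns out to be a constant, promote it to an attribute."""
    if op.shape_input is not None:
        return Reshape(op.data, shape_attr=op.shape_input)
    return op

def cse(ops):
    """Common subexpression elimination: structurally equal ops are merged."""
    seen = {}
    return [seen.setdefault(op, op) for op in ops]

# Two dynamic reshapes fed the same constant shape: after the
# dyn-to-static pass they are identical static ops, so CSE merges them.
ops = [Reshape("x", shape_input=(2, 4)), Reshape("x", shape_input=(2, 4))]
merged = cse([dyn_to_static(op) for op in ops])
assert merged[0] is merged[1]
assert merged[0].shape_attr == (2, 4)
```

Once the shape lives in the op's attributes, structural equality (and hence CSE) sees the two reshapes as the same expression.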

2020-06-13 Thread Thomas V via TVM Discuss
Just a quick shout regarding potential difficulties: I think the difficulty in common subexpression elimination with reshape is a consequence of the A0 approach to reshape: https://discuss.tvm.ai/t/discuss-pass-for-merging-shape-tensors/6955

Best regards, Thomas

2020-06-11 Thread tqchen via TVM Discuss
To keep things simple, we can disallow symbolic vars in attributes and force attributes to be constant; if a var is symbolically dependent, we should use the dyn variant.
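That dispatch rule can be sketched in a few lines (hypothetical helper names, not Relay's actual converter logic):

```python
def is_constant_shape(shape) -> bool:
    """True iff every dimension is a concrete int (no symbolic vars)."""
    return all(isinstance(dim, int) for dim in shape)

def choose_reshape_variant(shape) -> str:
    # Fully constant shape -> static op, shape stored as an attribute.
    # Any symbolic/unknown dimension -> the dyn variant, shape as input.
    return "reshape" if is_constant_shape(shape) else "dyn.reshape"

assert choose_reshape_variant([2, 4]) == "reshape"
assert choose_reshape_variant([2, "n"]) == "dyn.reshape"  # "n": a symbolic var
```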

2020-06-11 Thread Haichen Shen via TVM Discuss
I'm also in favor of the A1 approach. I have one more question about dynamic ops. Currently, Relay allows using a symbolic var to represent a dimension. In the world of A1, if attributes contain a symbolic var, such as the new shape in `reshape`, are we treating the op as a dynamic op or a static op?

2020-06-10 Thread Josh Fromm via TVM Discuss
Just wanted to add a +1 to A1 as it seems like the best way to gradually move TVM towards dynamism without breaking things in the meantime.

2020-06-10 Thread tqchen via TVM Discuss
Seems we have converged on A1 with the additional clarifications in this thread :slight_smile:

2020-06-10 Thread Lixiaoquan via TVM Discuss
Now I think dynamic rank support should be a separate issue. Maybe we can discuss that in another thread.

2020-06-09 Thread Matthew Brookhart via TVM Discuss
@kevinthesun @tqchen Are you guys agreeing to A1? @lixiaoquan I haven't put much thought into dynamic rank. I'm not sure how we would produce it with the current opset, do you have any use cases in mind?

2020-06-09 Thread Yao Wang via TVM Discuss
Makes sense. For API users, we will provide a more dynamic API to support arbitrary input cases. In the backend, we separate the purely static case (which probably requires no shape function?) from the dynamic cases to make it easier to maintain the related passes.

2020-06-09 Thread tqchen via TVM Discuss
I think the main topic of interest here is the way we define function signatures, not necessarily how to broaden the scope of the same function to support more flexible inputs (e.g. `Any`). I think the main goal of A1 concerns the semantics of the attribute. Since attributes have always been considered

2020-06-09 Thread Lixiaoquan via TVM Discuss
Any thought about dynamic rank support?

2020-06-08 Thread Yao Wang via TVM Discuss
Correct me if my understanding is wrong: is the goal of A1 to eventually merge static and dynamic ops into a single dynamic API, whose input tensors allow dynamic input and whose attributes only allow constants (like TensorFlow)? Also, in terms of the boundary between static and dynamic ops, we still need to

2020-06-08 Thread tqchen via TVM Discuss
Both A0 and A1 should be able to reduce the complexity of the frontend logic, as conversion can always go to the dynamic variants, and then a follow-up conversion promotes the dynamic variants to their static counterparts. From the interface design PoV, A0 somewhat creates additional duplication
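The two-stage flow described here can be sketched as follows (toy dictionaries standing in for IR nodes; the function names are illustrative, not actual Relay passes):

```python
def import_reshape(framework_shape):
    """Stage 1: frontends always emit the dynamic variant, so importer
    logic never has to decide static vs. dynamic per framework."""
    return {"op": "dyn.reshape", "shape": framework_shape}

def promote_to_static(node):
    """Stage 2: a later pass promotes ops whose shape argument is a
    compile-time constant to the static counterpart."""
    if node["op"] == "dyn.reshape" and isinstance(node["shape"], tuple):
        return {"op": "reshape", "newshape": node["shape"]}
    return node

# A constant shape is promoted; a runtime shape tensor (named by a
# string here) stays dynamic.
static_node = promote_to_static(import_reshape((2, 4)))
dyn_node = promote_to_static(import_reshape("shape_tensor"))
assert static_node == {"op": "reshape", "newshape": (2, 4)}
assert dyn_node["op"] == "dyn.reshape"
```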

2020-06-08 Thread Yao Wang via TVM Discuss
@mbrookhart Thank you for this RFC. IMHO, one advantage of A0 is that it is more user-friendly to have a single unified API handle both static and dynamic shape cases. Though this adds complexity to each op's type inference, it reduces 1) the number of Relay ops, and 2) the complexity of the frontend logic to ha
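The type-inference trade-off can be made concrete with a toy sketch (not Relay's actual type relations): a unified A0-style op must branch on how the shape was supplied, while the split A1-style ops each have one unambiguous rule.

```python
from typing import Optional, Tuple

Shape = Optional[Tuple[int, ...]]  # None = shape unknown until runtime

def infer_reshape_a0(newshape, is_runtime_tensor: bool) -> Shape:
    """A0 style: one op, so type inference must handle both cases."""
    if is_runtime_tensor:
        return None              # shape comes from a tensor: unknown statically
    return tuple(newshape)       # shape is a constant attribute: fully known

def infer_reshape_static(newshape) -> Shape:
    """A1 style: the static op always knows its output shape."""
    return tuple(newshape)

def infer_reshape_dyn() -> Shape:
    """A1 style: the dyn op's output shape is always a runtime value."""
    return None

assert infer_reshape_a0((2, 4), is_runtime_tensor=False) == (2, 4)
assert infer_reshape_a0((2, 4), is_runtime_tensor=True) is None
assert infer_reshape_static((2, 4)) == (2, 4)
assert infer_reshape_dyn() is None
```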

2020-06-08 Thread Matthew Brookhart via TVM Discuss
Frameworks and state-of-the-art models are moving more and more toward dynamism, where the shapes of tensors in a model are calculated at runtime, either from the shapes of inputs or from the values of inputs. There are a number of efforts underway in TVM to better support dynamic models, inc
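A minimal pure-Python illustration of both flavors of runtime-computed shapes (the `reshape` helper here is an illustrative stand-in for a tensor op):

```python
def reshape(flat, shape):
    """Minimal row-major reshape of a flat list into a 2-D nested list."""
    rows, cols = shape
    assert rows * cols == len(flat)
    return [flat[r * cols:(r + 1) * cols] for r in range(rows)]

# Shape from input *values*: the length of a filtered result depends on
# the data itself, so it is only known at runtime.
x = [3, -1, 4, -1, 5]
positive = [v for v in x if v > 0]
assert len(positive) == 3

# Shape computed at runtime: the target shape is itself derived from the
# data before the reshape runs.
shape = (len(positive), 1)
y = reshape(positive, shape)
assert y == [[3], [4], [5]]
```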