[Apache TVM Discuss] [Questions] [AutoTuning] How to debug when all trials are failing on GPU

2020-10-26 Thread jeremyj via Apache TVM Discuss
I've met the same problem, and I'd like to ask: which LLVM version worked in the end? --- [Visit Topic](https://discuss.tvm.apache.org/t/autotuning-how-to-debug-when-all-trials-are-failing-on-gpu/2833/5) to respond.

[Apache TVM Discuss] [Questions] Relay cannot compile while_loop

2020-10-26 Thread Hao Luo via Apache TVM Discuss
Thank you very much. I have understood it. --- [Visit Topic](https://discuss.tvm.apache.org/t/relay-cannot-compile-while-loop/8294/3) to respond.

[Apache TVM Discuss] [Questions] How to match the pattern of a function in Relay?

2020-10-26 Thread Cody H. Yu via Apache TVM Discuss
Hmm, I’m not quite sure whether the pattern matcher will go into Relay functions for matching. I’ll check it later, but maybe @mbrookhart could comment. Meanwhile, maybe we can make a new FunctionPattern that matches function nodes.

[Apache TVM Discuss] [Questions] How to match the pattern of a function in Relay?

2020-10-26 Thread moderato via Apache TVM Discuss
Thanks! I had checked that out, but it seems it doesn't show a way to match a function. In my case, conv+mul+add+relu is already wrapped into a function, so I failed to match them directly. One example in the tutorial related to function matching uses a function attr, but it looks like the function

[Apache TVM Discuss] [Questions] How to match the pattern of a function in Relay?

2020-10-26 Thread Cody H. Yu via Apache TVM Discuss
Check this out: https://tvm.apache.org/docs/langref/relay_pattern.html --- [Visit Topic](https://discuss.tvm.apache.org/t/how-to-match-the-pattern-of-a-function-in-relay/8283/2) to respond.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread Jared Roesch via Apache TVM Discuss
@masahi FromTupleType is the one you probably want; it takes a Type representing the layout of `expr` and returns a sequence of expressions corresponding to the linearized view of the tuple, i.e. it will handle projecting nested tuples out.

[Apache TVM Discuss] [Questions] Relay cannot compile while_loop

2020-10-26 Thread masahi via Apache TVM Discuss
You cannot use `relay.build(...)` to build a model with control flow. For that, you need to use the VM. See, for example, https://github.com/apache/incubator-tvm/blob/efe3a79aacd934ea5ffb13170230bf199a473e72/tests/python/frontend/pytorch/test_forward.py#L1914

[Apache TVM Discuss] [Questions] Relay cannot compile while_loop

2020-10-26 Thread Hao Luo via Apache TVM Discuss
I want to test the usage of relay while_loop and wrote the following simple example:

```
x = relay.var("x", shape=(10, 20))
i = relay.var("i", shape=tuple(), dtype="int32")
def myfun(x, i):
    z = relay.add(x, relay.const(1, "float32"))
    j = relay.add(i, relay.const(1, "in
```

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread masahi via Apache TVM Discuss
Thanks, I'll take a look. --- [Visit Topic](https://discuss.tvm.apache.org/t/graph-plan-memory-doesnt-support-nested-tuples/8278/8) to respond.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread Zhi via Apache TVM Discuss
The helpers are here: https://github.com/apache/incubator-tvm/blob/98c2096f4944bdbdbbb2b7b20ccd35c6c11dfbf6/src/relay/op/memory/memory.cc#L287-L300 --- [Visit Topic](https://discuss.tvm.apache.org/t/graph-plan-memory-doesnt-support-nested-tuples/8278/7) to respond.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread Jared Roesch via Apache TVM Discuss
There is a C++ helper called Linearize or FlattenTuple (can look later) --- [Visit Topic](https://discuss.tvm.apache.org/t/graph-plan-memory-doesnt-support-nested-tuples/8278/6) to respond.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread masahi via Apache TVM Discuss
Ok, thanks! I found the code Jared was probably referring to (`transform/memory_plan.py` and `transform/memory_alloc.py`; not sure why they are written in Python). I'm going to learn about memory planning and see what I can do.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread Jared Roesch via Apache TVM Discuss
@masahi there is code for doing this mapping inside the VM; if you message me on Slack, we can probably figure out how to update the code. It might require a bit of debugging. --- [Visit Topic](https://discuss.tvm.apache.org/t/graph-plan-memory-doesnt-support-nested-tuples/8278/4) to respond.

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread tqchen via Apache TVM Discuss
Yes, we will need to update the code if we want to support nested tuples. Perhaps we can pass the token around also in nested tuples and unpack them. --- [Visit Topic](https://discuss.tvm.apache.org/t/graph-plan-memory-doesnt-support-nested-tuples/8278/3) to respond.

[Apache TVM Discuss] [Questions] Understanding TVM/Relay's PartitionGraph()(mod) function

2020-10-26 Thread Cody H. Yu via Apache TVM Discuss
Good catch @masahi :grinning: --- [Visit Topic](https://discuss.tvm.apache.org/t/understanding-tvm-relays-partitiongraph-mod-function/8290/5) to respond.

[Apache TVM Discuss] [Questions] Understanding TVM/Relay's PartitionGraph()(mod) function

2020-10-26 Thread masahi via Apache TVM Discuss
Isn't it simply a problem of free variables? I suggest replacing

```
f = relay.Function([], result)
```

with

```
f = relay.Function(relay.analysis.free_vars(result), result)
```

--- [Visit Topic](https://discuss.tvm.apache.org/t/understanding-tvm-relays-partitiongraph-mod-function/8290/4) to respond.

[Apache TVM Discuss] [Questions] Understanding TVM/Relay's PartitionGraph()(mod) function

2020-10-26 Thread Cody H. Yu via Apache TVM Discuss
The recent PR should fix this: https://github.com/apache/incubator-tvm/pull/6641. See this unit test: https://github.com/apache/incubator-tvm/blob/main/tests/python/relay/test_pass_annotate_target.py#L358

[Apache TVM Discuss] [Questions] Understanding TVM/Relay's PartitionGraph()(mod) function

2020-10-26 Thread Matt Barrett via Apache TVM Discuss
Ping @comaniac @manupa-arm. I have a feeling the if/else handling in this pass might not be correct. Are you only seeing this problem when you have an If? --- [Visit Topic](https://discuss.tvm.apache.org/t/understanding-tvm-relays-partitiongraph-mod-function/8290/2) to respond.

[Apache TVM Discuss] [Questions] Understanding TVM/Relay's PartitionGraph()(mod) function

2020-10-26 Thread jmatai via Apache TVM Discuss
Hi all, I am working to understand TVM/Relay's graph partitioning functionality. Specifically, I have created the following simple example, and I am getting the error shown below. I understand that the PartitionGraph() function assumes the graph is annotated with a target with Annotat

[Apache TVM Discuss] [Questions] Cloning a NN model

2020-10-26 Thread M1k3 via Apache TVM Discuss
Is there an API to make a clone (aka deepcopy) of a module and params? Thanks --- [Visit Topic](https://discuss.tvm.apache.org/t/cloning-a-nn-model/8288/1) to respond.

[Apache TVM Discuss] [Questions] Is sparse kernel supported in the tensor expression language?

2020-10-26 Thread Tristan Konolige via Apache TVM Discuss
@kwmaeng I've written the sparse_dense kernel for GPUs. It was a bit of an arduous process, but here are my takeaways:

- Using te only works for some sparse kernels. Sparse kernels are often written as functions over the input tensor. Unfortunately, te requires you to write your kernel

[Apache TVM Discuss] [Questions] Graph_plan_memory doesn't support nested tuples?

2020-10-26 Thread Matt Barrett via Apache TVM Discuss
We (@manupa-arm) ran into this in the graph partitioner. I think in the end we were forced to introduce logic to flatten such tuples, so if a more fundamental solution can be found, that would simplify our logic.

[Apache TVM Discuss] [Questions] How to detect the pattern of a function?

2020-10-26 Thread moderato via Apache TVM Discuss
```
def @main(%data: Tensor[(1, 112, 112, 32), float32]) -> Tensor[(1, 112, 112, 64), float32] {
  %3 = fn (%p0: Tensor[(1, 112, 112, 32), float32], %p1: Tensor[(3, 3, 32, 1), float32], %p2: Tensor[(1, 1, 1, 32), float32], %p3: Tensor[(1, 1, 1, 32), float32], Primitive=1) -> Tensor[(1, 112
```