[Apache TVM Discuss] [Questions] [PassInfra] Relay Function Pass
Hello,

In the [PassInfra Design and Development Doc](https://tvm.apache.org/docs/dev/pass_infra.html#pass-objects) the `function_pass` decorator is briefly explained. The Python codebase also has `class FunctionPass`; objects of this type should be created with `function_pass`. The docstring also states that such a pass "works on each tvm.relay.Function in a module" (https://github.com/apache/incubator-tvm/blob/master/python/tvm/relay/transform/transform.py#L85).

I have the following code:

```
import tvm
from tvm import relay

dtype = 'float32'
ishape = (1, 32, 14, 14)
wshape = (32, 32, 3, 3)

data1 = relay.var("data0", shape=ishape, dtype=dtype)
weight1 = relay.var("weight0", shape=wshape, dtype=dtype)
conv2d_1 = relay.nn.conv2d(data1, weight1, kernel_size=(3, 3), padding=(1, 1))
relu_1 = relay.nn.relu(conv2d_1)
f = relay.Function([data1, weight1], relu_1)

mod = tvm.IRModule()
mod['main'] = f
with tvm.transform.PassContext(opt_level=3):
    # I am only doing this to force the module to have many internal
    # functions and function calls
    opt_mod, _ = relay.optimize(mod, 'llvm')

x = relay.var("x", shape=(10, 20))
f1 = relay.Function([x], x)

@relay.transform.function_pass(opt_level=1)
class TestReplaceFunc:
    """Simple test pass: keep the first function, replace every later one with new_func."""
    def __init__(self, new_func):
        self.new_func = new_func
        self.i = 0

    def transform_function(self, func, mod, ctx):
        if self.i == 0:
            self.i = 1  # leave the first function untouched
            return func
        else:
            return self.new_func

fpass = TestReplaceFunc(f1)
print(opt_mod)
end_mod = fpass(opt_mod)
print(end_mod)
```

The outputs of `opt_mod` and `end_mod` are identical, and if I set a breakpoint at the `if self.i == 0:` statement, I see it only stops once. `opt_mod` has many (internal) functions, but only one global function (`main`).

* Why is it not recursing through the rest of the internal functions?
If I run the same script as above but add another global function (e.g. `mod['not_main'] = another_relay_function`) before optimizing, I also notice that `opt_mod` only has the `main` function.

* What pass is deleting the `not_main` function?
  * There is no call to `not_main` from `main`, so maybe dead code elimination? But I find it odd that it always preserves the `main` function.
* Is there a way to recurse through all global functions?
  * The `IRModule` has the attribute `functions`, but I don't know how to get its keys or an iterator over them.

Thank you for the help :slight_smile:

---

[Visit Topic](https://discuss.tvm.apache.org/t/passinfra-relay-function-pass/7979/1) to respond. You are receiving this because you enabled mailing list mode. To unsubscribe from these emails, [click here](https://discuss.tvm.apache.org/email/unsubscribe/5736c2a9fab3957a88a2f09860bda8cd84e7360c929b4509796678fe16fc5ba9).
[Apache TVM Discuss] [Questions] How to match the pattern of a function in Relay?
I also +1 this feature. It seems that what you want to do is very similar to a previous post of mine: https://discuss.tvm.apache.org/t/relay-pattern-replacing-with-custom-relay-ops/7531/13

The biggest fear I have about introducing a new Relay operator (which I guess is what would eventually happen if we follow this path) is that it would have to work with all the Relay transformation passes. The example in the TVM documentation (https://tvm.apache.org/docs/dev/relay_add_op.html) seems too trivial and not sufficient to answer this question.

Any guidance you could give would be gladly received :slight_smile:

---

[Visit Topic](https://discuss.tvm.apache.org/t/how-to-match-the-pattern-of-a-function-in-relay/8283/12) to respond.
[Apache TVM Discuss] [Questions] Passing a tuple to a packed func?
Hello everyone,

I was wondering if there is a way to pass a tuple (or array) directly to a packed func:

```
tpl = (1, 2, 3)
tvm.tir.call_packed("func_name", tpl)
```

Reading the [packed_func documentation](https://tvm.apache.org/docs/dev/runtime.html?highlight=packed_func#packedfunc), I don't see a specific mention of a tuple being a valid input argument type. Nonetheless, it states that a "TVM Object to represent any object in IR" would be allowed.

I found that there is a [`tir.tvm_tuple`](https://tvm.apache.org/docs/api/doxygen/namespacetvm_1_1tir_1_1builtin.html#ab424ca353ceedd88a95fc37eeb9628a9), but when I try to use it from a Python script I get the following error:

```
AttributeError: module 'tvm.tir' has no attribute 'tvm_tuple'
```

In the codebase, I only see two Python examples of `tvm_tuple` being used:

1. [In the TIR transform pipeline of VTA](https://github.com/apache/tvm/blob/main/vta/python/vta/transform.py#L780), but there it is called inside a call intrinsic.
2. [In a testing script](https://github.com/apache/tvm/blob/main/tests/python/unittest/test_tvmscript_roundtrip.py#L6939), but there it is used inside of hybrid script, which for some reason doesn't throw the error. (Actually, if I set a breakpoint there, I can't stop the execution there.)

Thanks

---

[Visit Topic](https://discuss.tvm.apache.org/t/passing-a-tuple-to-a-packed-func/9117/1) to respond.
[Apache TVM Discuss] [Questions] Tiled tensors for external (python) functions and TIR == te.extern?
@eric-haibin-lin @ziheng any thoughts? I saw you were involved in [PR 6079](https://github.com/apache/tvm/pull/6079), which seems to have been the last modification to `te.extern`.

---

[Visit Topic](https://discuss.tvm.apache.org/t/tiled-tensors-for-external-python-functions-and-tir-te-extern/9083/2) to respond.
[Apache TVM Discuss] [Questions] Passing a tuple to a packed func?
Thanks for your response, but I still don't understand what the solution is, except for decomposing the tuple into elementary arguments.

---

[Visit Topic](https://discuss.tvm.apache.org/t/passing-a-tuple-to-a-packed-func/9117/3) to respond.
[Apache TVM Discuss] [Questions] Hi, does anyone know the difference between @reg.register_strategy and reg.register_compute?
Hey,

Take this with a grain of salt, since I am not an official voice of the people who developed those things.

`compute` should be used to register the naive, non-optimized, non-hardware-dependent computation rule of a Relay operator. You can think of it as a golden reference. Now, there can be many algorithmic implementations of this computation, especially for different HW backends. These different implementations are normally called `schedules`, and all `schedules` were gathered into TOPI.

At least to my understanding, a `strategy` is something somewhat higher-order than a `schedule`, but it still describes an implementation variant of a `compute`. There can be moments when, given the parameters of the operator and your HW, you can already decide on an algorithmic `strategy` and tune (or use an already optimized schedule of) the appropriate strategy.

If I had to give an example (warning: I am not sure it's like this in the repo):

* Compute == the naive conv2d implementation
* Strategy == {winograd, direct convolution, im2col transformation}
  * although all of these will give the same numerical value, they have considerable differences in their implementations
* Schedule == {winograd({target_0, target_1}), direct convolution({target_0,...,target_n-1}), im2col({target_3, target_n})}
  * as you can see, not all targets have `schedule` (template) implementations for each of the different `strategies`

Hope this helps

---

[Visit Topic](https://discuss.tvm.apache.org/t/hi-does-anyone-know-the-difference-between-reg-register-strategy-and-reg-register-compute/11452/2) to respond.
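The hierarchy above can be sketched with plain Python data structures (purely illustrative, not TVM's actual registries; the strategy and target names are made up):

```python
# One golden-reference computation rule per operator.
compute = {"conv2d": "naive_conv2d_reference"}

# Several algorithmic strategies implement the same compute, and each
# strategy only has schedule templates for some targets.
schedules = {
    "winograd": {"target_0", "target_1"},
    "direct":   {"target_0", "target_1", "target_2"},
    "im2col":   {"target_1", "target_3"},
}

def strategies_for(target):
    """Strategies that actually have a schedule template for this target."""
    return sorted(s for s, targets in schedules.items() if target in targets)
```

For example, `strategies_for("target_3")` would return only `["im2col"]`, mirroring the point that not every target implements every strategy.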
[Apache TVM Discuss] [Questions] Ops become slow when using te.var
I guess what you are seeing are the cases where the variable can have any value in the range of the data type you selected, and therefore, for correctness, all those if statements are necessary.

Any reason you want to use `te.var`? I am guessing there is some dynamic shape you want to support, but I think the TVM-standard way of handling this is to JIT each time the values change.

---

[Visit Topic](https://discuss.tvm.apache.org/t/ops-become-slow-when-using-te-var/11486/2) to respond.
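The "JIT each time the values change" idea can be sketched as a compile cache keyed by the concrete shape. This is a plain-Python stand-in; `build_for_shape` is a placeholder for an actual TVM build with fixed (constant) dimensions instead of `te.var`:

```python
compile_cache = {}

def build_for_shape(shape):
    """Placeholder for compiling a kernel specialized to one fixed shape."""
    return lambda: f"kernel specialized for {shape}"

def get_kernel(shape):
    # Compile only the first time a concrete shape is seen; afterwards the
    # specialized kernel (with no data-dependent if statements) is reused.
    if shape not in compile_cache:
        compile_cache[shape] = build_for_shape(shape)
    return compile_cache[shape]

k1 = get_kernel((1, 32, 14, 14))
k2 = get_kernel((1, 32, 14, 14))  # cache hit: same specialized kernel
```

The trade-off is compile time on the first occurrence of each shape, in exchange for kernels where every bound is a compile-time constant.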
[Apache TVM Discuss] [Questions] Ops become slow when using te.var
Just inline the one stage into the other one?

EDIT: wait, your if statements require variables which are not defined (`blockIdx.x` and `threadIdx.x`).

---

[Visit Topic](https://discuss.tvm.apache.org/t/ops-become-slow-when-using-te-var/11486/4) to respond.
[Apache TVM Discuss] [Questions] How to add my custom Relay node to pattern matcher? [EthosU example]
Hello,

I was looking at the Arm EthosU integration in TVM and [noticed that there is a new conv2d Relay operator defined](https://github.com/apache/tvm/blob/main/python/tvm/relay/backend/contrib/ethosu/op/convolution.py#L185). Obviously this operator is only legal/valid for offloading onto the EthosU, [so they decided to replace standard Relay operator patterns inside of a Composite with this single operator](https://github.com/apache/tvm/blob/main/python/tvm/relay/backend/contrib/ethosu/legalize.py#L126). That makes it easier to define their own Relay->TIR pipeline, since they only need to parse one node and all its parameters/attributes.

I was wondering, though: how could this (or any) new Relay operator be added to the pattern matching infrastructure? I guess what I would like is the following:

```
# Original Relay module using standard Relay operators
%1 = standard_relay_op_1(...)
%2 = standard_relay_op_2(...)

# Assume I have some custom Relay operators that each match to one of those above
%1 = my_custom_relay_op_1(...)
%2 = my_custom_relay_op_2(...)

# Now, I further have a third Relay operator which matches exactly the pattern above
%1 = my_custom_relay_op_3()
```

I know that in this simple case there is a solution using the already available pattern matcher infrastructure (i.e. defining the pattern matched by `my_custom_relay_op_3` before the others in the pattern list). But let's say that in more complex situations I couldn't solve it just via position in the pattern list. How do I add `my_custom_relay_op_{1,2}` to the pattern matching infrastructure such that I can define the pattern matching for `my_custom_relay_op_3` with statements/expressions like `is_op('my_custom_relay_op_1')`?

Thank you all

---

[Visit Topic](https://discuss.tvm.apache.org/t/how-to-add-my-custom-relay-node-to-pattern-matcher-ethosu-example/11498/1) to respond.
[Apache TVM Discuss] [Questions] How to add my custom Relay node to pattern matcher? [EthosU example]
@manupa-arm @matt-arm So maybe I can ask more directly: is the ordering of first pattern matching your offloadable pattern, and then, within the extracted composite, replacing the native Relay operators with your new `ethosu.conv2d` Relay operator, a workaround for not being able to do what I described before?

The following would feel like a simpler pipeline:

1. Find and replace, in the main Relay module, the native Relay op pattern which describes your `ethosu.` (just like you do in your legalization process)
2. PartitionGraph, given that `ethosu.` is offloadable
3. Relay->TIR now works like you already have it

---

[Visit Topic](https://discuss.tvm.apache.org/t/how-to-add-my-custom-relay-node-to-pattern-matcher-ethosu-example/11498/2) to respond.
[Apache TVM Discuss] [Questions] Repeatable build/compile
Yes, different AutoTVM/auto-scheduler runs on the same network can yield different implementations. If you know your model doesn't change, you would autoschedule once and save the log files of the optimized implementations. If for some reason you need to recompile your model, you would tell TVM to just look for this logfile with the implementations, so you get the same result as before.

---

[Visit Topic](https://discuss.tvm.apache.org/t/repeatable-build-compile/11516/2) to respond.