Hi All,

I am trying to understand TVM/Relay's graph partitioning functionality. Specifically, I have put together the simple example below and I am running into the error shown after it.

I understand that the PartitionGraph() pass assumes the graph has already been annotated with a target via AnnotateTarget(["target"]). Based on my reading, I wrote the following example to partition the "add" operator into a separate function. (I realize I could also split "add" into a separate Relay function using the Relay pattern language or by traversing the AST; here I am specifically trying to understand how PartitionGraph() behaves in a simple case.)
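
For context, my mental model of the flow (from the BYOC material I have read) is roughly the sketch below: register a "target.<name>" attribute for the operator, build a module whose main function binds all of its variables as parameters, then run AnnotateTarget followed by PartitionGraph. The "special_demo" target name is only a placeholder for this sketch so it does not clash with the "special" registration in my real code further down, and MergeCompilerRegions is optional as far as I understand.

```
# Rough sketch of the flow as I understand it; "special_demo" is a
# placeholder target name used only for this sketch.
import tvm
from tvm import relay


@tvm.ir.register_op_attr("add", "target.special_demo")
def _add_supported(attrs, args):
    return True  # offload "add" to the hypothetical external codegen


a = relay.var("a", shape=(10, 1))
b = relay.var("b", shape=(10, 1))
func = relay.Function([a, b], relay.add(a, b))  # all variables are bound as parameters
mod = tvm.IRModule({"main": func})

mod = relay.transform.AnnotateTarget(["special_demo"])(mod)
mod = relay.transform.MergeCompilerRegions()(mod)  # optional, as far as I understand
mod = relay.transform.PartitionGraph()(mod)
print(mod)
```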

Here is my code:

```
import tvm
from tvm import relay

graph_type = 1


def _register_external_op_helper(op_name, supported=True):
    # Mark op_name as supported by the external "special" codegen.
    @tvm.ir.register_op_attr(op_name, "target.special")
    def _func_wrapper(attrs, args):
        return supported

    return _func_wrapper


_register_external_op_helper("add")
_register_external_op_helper("subtract")



if graph_type == 1:
    # this is test case for graph type 1
    print("Graph type 1")

    # graph 1: true branch
    x1 = relay.var('x', shape=(10, 1))
    y1 = relay.var('y', shape=(10, 1))

    # graph 2: false branch
    x2 = relay.var('x', shape=(10, 1))
    y2 = relay.var('y', shape=(10, 1))

    f1 = relay.op.add(x1, y1)

    f2 = relay.op.multiply(x2, y2)

    cond = relay.var('c')
    result = relay.If(cond, true_branch=f1, false_branch=f2)
    f = relay.Function([], result)

    mod = tvm.IRModule({"main": f})

    mod = relay.transform.AnnotateTarget(["special"])(mod)  # ==> it gives the error here
    mod = relay.transform.PartitionGraph()(mod)
```

Here is the error I am stuck on:

```
Graph type 1
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2020.1.2\plugins\python\helpers\pydev\pydevd.py", line 1438, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2020.1.2\plugins\python\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/repos/tvm23/tvm/graph_opt/subgraph/PartitionGraphTry.py", line 48, in <module>
    mod = relay.transform.AnnotateTarget(["special"])(mod)  # Output: Figure 2
  File "C:\repos\tvm23\tvm\python\tvm\ir\transform.py", line 127, in __call__
    return _ffi_transform_api.RunPass(self, mod)
  File "C:\repos\tvm23\tvm\python\tvm\_ffi\_ctypes\packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  File "C:\repos\tvm23\tvm\src\ir\module.cc", line 192
TVMError: Check failed: fv.size() == 0 (5 vs. 0) : There are free variables: [Var(c, ty=TensorType([], bool)), Var(x, ty=TensorType([10, 1], float32)), Var(y, ty=TensorType([10, 1], float32)), Var(x, ty=TensorType([10, 1], float32)), Var(y, ty=TensorType([10, 1], float32))] in function: #[version = "0.0.5"]
fn () -> Tensor[(10, 1), float32] {
  free_var %c: bool;
  if (%c) {
    free_var %x: Tensor[(10, 1), float32];
    free_var %y: Tensor[(10, 1), float32];
    add(%x, %y) /* ty=Tensor[(10, 1), float32] */
  } else {
    free_var %x1: Tensor[(10, 1), float32];
    free_var %y1: Tensor[(10, 1), float32];
    multiply(%x1, %y1) /* ty=Tensor[(10, 1), float32] */
  }
}

```
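
My reading of the error is that the free variables come from the empty parameter list in relay.Function([], result). A variant that binds them explicitly would look like the sketch below (untested; I am not sure whether this is the intended setup for AnnotateTarget/PartitionGraph, which is exactly what I am trying to understand):

```
# Untested sketch, reusing cond, x1, y1, x2, y2 and result from the code above:
# bind every variable as a parameter of "main" so nothing is left free.
f = relay.Function([cond, x1, y1, x2, y2], result)
mod = tvm.IRModule({"main": f})

mod = relay.transform.AnnotateTarget(["special"])(mod)
mod = relay.transform.PartitionGraph()(mod)
print(mod)
```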




