Hi @zhiics @comaniac,

I am using BYOC to offload transformers to external codegen tools. These 
transformers are composite functions. I had been using this feature well with 
my manually-generated annotation passes, but when I merged the latest changes to 
go through the `AnnotateTarget -> PartitionGraph` passes, I found that codegen 
fails because the generated function is wrong.

The transformer outputs a single value, and this value is used in three places 
in the model. However, the generated function returns this value as a 3-tuple:

```
    ...
    add(%268, %output_layernorm_bias2) /* ty=Tensor[(1, 64, 512), float32] */
  };
  %270 = %269(meta[relay.Constant][32] /* ty=Tensor[(512), float32] */ ...;
  (%270, %270, %270)
}
```

The return value should just be `%270`.

After checking the output of `AnnotateTarget`, I found that the issue is that a 
new `CompilerEnd` annotation is added each time this output is used. For 
example:

```
%395 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][105])
...
%444 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][140])
...
%475 = annotation.compiler_end(%394, meta[relay.attrs.CompilerAttrs][162])
```

This definitely seems like a bug, and it is causing my codegen to break since the 
body of the function is a tuple rather than a call node. Is there a good 
workaround, or an easy way to fix this?
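In case it helps the discussion, here is a minimal, TVM-free sketch of the workaround I am considering on the codegen side: if the partitioned function's body is a tuple whose fields all reference the same expression, collapse it back to that single value. `Call` and `Tuple` below are stand-in classes for illustration only, not the real Relay node types or TVM API.

```python
# Hypothetical workaround sketch: collapse a tuple of N references to the
# same expression back into that single expression before running codegen.
# Call/Tuple are mock stand-ins for Relay nodes, not the actual TVM classes.

from dataclasses import dataclass


@dataclass(frozen=True)
class Call:
    op: str
    args: tuple = ()


@dataclass(frozen=True)
class Tuple:
    fields: tuple


def collapse_duplicate_tuple(body):
    """If `body` is a Tuple whose fields are all the same node,
    return that node; otherwise return `body` unchanged."""
    if isinstance(body, Tuple) and len(set(map(id, body.fields))) == 1:
        return body.fields[0]
    return body


# The buggy partition returns (%270, %270, %270): three uses of one call.
call = Call("add", ("x", "bias"))
buggy_body = Tuple((call, call, call))
fixed = collapse_duplicate_tuple(buggy_body)
assert fixed is call  # the codegen sees a single call node again
```

This only papers over the symptom in my own codegen, though; the duplicate `CompilerEnd` annotations would still be emitted upstream, so a fix in `AnnotateTarget` itself seems preferable.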

Thanks!

---
[Visit 
Topic](https://discuss.tvm.ai/t/incorrect-generated-function-after-partitiongraph-pass/6380/1)
 to respond.
