Hello,

I’m exploring TVM to optimize AI models and port them to embedded devices. While 
running the tutorial code on TVM version `0.22.dev0`, I ran into an issue 
that I’d like some guidance on.

When converting the ResNet model to IR using the “End-to-End Optimized Model” 
official tutorial, I get the following error:

```
Traceback (most recent call last):
  File ".../base_fx_graph_translator.py", line 938, in _conv2d
    return self._conv2d_impl(
           ^^^^^^^^^^^^^^^^^^
  File ".../base_fx_graph_translator.py", line 925, in _conv2d_impl
    assert len(self.shape_of(bias)) == 1
               ^^^^^^^^^^^^^^^^^^^
  File ".../base_fx_graph_translator.py", line 84, in shape_of
    if not isinstance(tensor.struct_info, relax.TensorStructInfo):
                      ^^^^^^^^^^^^^^^^^^

tvm.error.InternalError: Check failed: (ptr) is false: The struct_info is not 
populated, check if you have normalized the expr
[16:55:59] /block_builder.cc:64: Warning: BlockBuilder destroyed with remaining 
blocks!
```

From what I understand, the error occurs because when `bias` does not exist, the 
frontend represents it as a `relax.op.null_value()` call rather than Python 
`None`. In `_conv2d_impl`, the `if bias is None` check is therefore skipped, and 
the subsequent shape assertion fails. (I’m not sure my understanding is correct.)
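To illustrate what I think is happening, here is a minimal plain-Python sketch of the control flow (the classes and names below are stand-ins I made up for illustration, not real TVM objects):

```
# Stand-in illustrating the suspected control flow: a missing bias arrives
# as a null_value() Call node, not as Python None, so an `is None` guard
# never fires and the shape assertion is reached.

class Call:
    """Stand-in for relax.Call."""
    def __init__(self, op):
        self.op = op

NULL_VALUE_OP = object()  # stand-in for relax.op.null_value().op

def null_value():
    return Call(NULL_VALUE_OP)

def emit_bias_add(conv2d, bias):
    if bias is None:  # original guard: only catches Python None
        return conv2d
    # Without this extra check, a null_value Call falls through to the
    # shape assertion and crashes.
    if isinstance(bias, Call) and bias.op is NULL_VALUE_OP:
        return conv2d
    return ("add", conv2d, bias)  # stand-in for the bias-add emission

# A null_value bias is now treated the same as no bias:
assert emit_bias_add("conv", null_value()) == "conv"
```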

Is there a proper way to handle this scenario so that the model can be 
successfully converted?

For reference, here’s the temporary workaround I used:
```
    def _conv2d_impl(
        self,
        x: relax.Expr,
        weight: relax.Expr,
        bias: Optional[relax.Expr],
        strides: Optional[Tuple],
        padding: Optional[Tuple],
        dilation: Optional[Tuple],
        groups: Optional[Tuple],
    ):
        conv2d = self.block_builder.emit(
            relax.op.nn.conv2d(
                x,
                weight,
                strides=strides,
                padding=padding,
                dilation=dilation,
                groups=groups,
                data_layout="NCHW",
                kernel_layout="OIHW",
                out_dtype="float32",
            )
        )

        if bias is None:
            return conv2d

        # added: a missing bias can also reach here as a null_value() call,
        # so treat it the same as no bias
        if isinstance(bias, relax.Call) and bias.op == relax.op.null_value().op:
            return conv2d

        assert len(self.shape_of(bias)) == 1
        bias = relax.op.reshape(bias, (1, -1, 1, 1))
        return self.block_builder.emit(relax.op.add(conv2d, bias))
```

---
[Visit 
Topic](https://discuss.tvm.apache.org/t/tutorial-code-does-not-work/18689/1) to 
respond.
