gemini-code-assist[bot] commented on code in PR #18658:
URL: https://github.com/apache/tvm/pull/18658#discussion_r2681380542
##########
python/tvm/relax/frontend/onnx/onnx_frontend.py:
##########
@@ -1127,7 +1127,31 @@ class PRelu(OnnxOpConverter):
def _impl_v1(cls, bb, inputs, attr, params):
x = inputs[0]
slope = inputs[1]
- return relax.op.nn.prelu(x, slope)
+
+ x_shape = x.struct_info.shape
+ slope_shape = slope.struct_info.shape
+
+ ndim = len(x_shape)
+ s_ndim = len(slope_shape)
+
+ if all(ss == 1 for ss in slope_shape) or s_ndim == 1:
+ slope = relax.op.reshape(slope, (slope_shape[0],))
+ return relax.op.nn.prelu(x, slope, ndim - 1)
Review Comment:

The logic for handling 1D or scalar-like slopes might not align with the
ONNX PReLU operator specification. This implementation defaults to applying the
slope on the last dimension (`axis = ndim - 1`).
While this is a valid interpretation of general ONNX broadcasting rules, the
PReLU operator has specific semantics. For instance, older versions of the ONNX
PReLU specification (e.g., opset 7) explicitly state that for a scalar slope
(shape `(1,)`) or a 1D per-channel slope (shape `(C,)`), the slope should be
applied along the channel axis, which is typically `axis=1` for `NCHW` data
formats.
Using `ndim - 1` could lead to incorrect behavior when the channel axis is
not the last dimension. For example, with an input of shape `(N, C, H, W)` and
a slope of shape `(C,)`, this implementation would incorrectly try to apply the
slope along the `W` axis.
To be more robust and compliant with the common use of PReLU, I recommend
using `axis=1` for these cases. This would align with the operator's typical
per-channel behavior.
```suggestion
return relax.op.nn.prelu(x, slope, 1)
```
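A small NumPy sketch (a hypothetical `prelu_ref` reference helper, not TVM code) makes the axis mismatch concrete:

```python
import numpy as np

# ONNX PReLU computes y = x if x >= 0 else slope * x, with slope
# broadcast to x. For a 1-D slope of shape (C,), the chosen broadcast
# axis decides which dimension is treated as "per-channel".
def prelu_ref(x, slope, axis):
    # Reshape the 1-D slope so it broadcasts along `axis` of x.
    shape = [1] * x.ndim
    shape[axis] = slope.shape[0]
    return np.where(x >= 0, x, slope.reshape(shape) * x)

x = np.full((1, 3, 2, 2), -1.0)    # NCHW input, all negative
slope = np.array([0.1, 0.2, 0.3])  # per-channel slope, shape (C,) with C=3

y = prelu_ref(x, slope, axis=1)    # channel axis: channel i is scaled by slope[i]
# y[0, 0] == -0.1, y[0, 1] == -0.2, y[0, 2] == -0.3 (elementwise)

# With axis = x.ndim - 1, the reshaped slope (1, 1, 1, 3) cannot
# broadcast against W = 2, so NumPy raises a ValueError here.
```

With `axis=1` the per-channel semantics come out as expected; with `ndim - 1` the slope either fails to broadcast (as above) or silently scales the wrong dimension.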
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]