The GitHub Actions job "PR" on tvm.git/fix/fuse-reduction-epilogue-relu has 
succeeded.
Run started by GitHub user kimm240 (triggered by kimm240).

Head commit for run:
d3b9c318c210cba13cb9466f2c26600b4f59de42 / hyun gyu kim <[email protected]>
[TIR][Schedule] FuseReductionEpilogue: Add ReLU support

The FuseReductionEpilogue primitive currently supports fusing bias addition
epilogues into reduction blocks. This commit extends the primitive to also
support ReLU activation functions in epilogue blocks, enabling fusion of
patterns like max(temp + bias, 0) into the reduction computation.
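To make the target pattern concrete, the following is a minimal
TVMScript sketch of the kind of PrimFunc the primitive operates on
(shapes and block names here are illustrative, not taken from the
commit):

    from tvm.script import tir as T

    @T.prim_func
    def before(A: T.Buffer((16, 16), "float32"),
               B: T.Buffer((16, 16), "float32"),
               Bias: T.Buffer((16, 16), "float32"),
               Out: T.Buffer((16, 16), "float32")):
        Temp = T.alloc_buffer((16, 16), "float32")
        # Reduction block producing the intermediate result.
        for i, j, k in T.grid(16, 16, 16):
            with T.block("matmul"):
                vi, vj, vk = T.axis.remap("SSR", [i, j, k])
                with T.init():
                    Temp[vi, vj] = T.float32(0)
                Temp[vi, vj] = Temp[vi, vj] + A[vi, vk] * B[vk, vj]
        # Epilogue block with the newly supported bias + ReLU pattern.
        for i, j in T.grid(16, 16):
            with T.block("epilogue"):
                vi, vj = T.axis.remap("SS", [i, j])
                Out[vi, vj] = T.max(Temp[vi, vj] + Bias[vi, vj],
                                    T.float32(0))

After fusion, the epilogue computation is absorbed into the reduction
block, so the ReLU clamp is applied per reduction iteration (see the
note on test semantics below).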

The implementation adds an EpilogueType enumeration to distinguish
between the Bias and BiasReLU patterns. The AnalyzeEpiloguePattern
method is extended to detect ReLU patterns by checking for MaxNode
expressions with a zero-constant operand.
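The C++ analysis itself is not reproduced here, but the detection
logic the commit describes can be mirrored in a short Python sketch
using the corresponding tvm.tir node classes (the helper names below
are hypothetical, not the commit's):

    from enum import Enum
    from tvm import tir

    class EpilogueType(Enum):
        NONE = 0
        BIAS = 1
        BIAS_RELU = 2

    def _is_zero_const(e):
        # Matches the zero-constant operand of max(x, 0).
        return isinstance(e, (tir.IntImm, tir.FloatImm)) and e.value == 0

    def _is_bias_add(e, temp_buf):
        # Simplified check: temp[...] + something, in either order.
        if not isinstance(e, tir.Add):
            return False
        return any(isinstance(op, tir.BufferLoad)
                   and op.buffer.same_as(temp_buf)
                   for op in (e.a, e.b))

    def classify_epilogue(value, temp_buf):
        # max(temp + bias, 0), in either operand order -> BiasReLU
        if isinstance(value, tir.Max):
            a, b = value.a, value.b
            inner = (a if _is_zero_const(b)
                     else b if _is_zero_const(a) else None)
            if inner is not None and _is_bias_add(inner, temp_buf):
                return EpilogueType.BIAS_RELU
        # temp + bias -> Bias (the previously supported pattern)
        if _is_bias_add(value, temp_buf):
            return EpilogueType.BIAS
        return EpilogueType.NONE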

This commit also adds comprehensive tests in
test_tir_schedule_fuse_reduction_epilogue_relu.py, following the same
patterns as the existing bias tests. The tests verify structural equality,
numerical correctness with per-iteration ReLU semantics, and multiple
epilogue block scenarios. All tests pass.
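As a sketch of what "per-iteration ReLU semantics" means numerically,
a NumPy reference in the spirit of those tests might look like the
following (assuming, purely as an illustration, that fusion folds the
bias into the reduction init; the exact fused form is defined by the
commit):

    import numpy as np

    def fused_relu_reference(a, b, bias):
        # The accumulator starts from the bias and is clamped after
        # every reduction step, not once at the end.
        m, k = a.shape
        out = bias.astype("float32")
        for kk in range(k):
            out = np.maximum(out + np.outer(a[:, kk], b[kk, :]), 0.0)
        return out

This differs from applying ReLU once after the full reduction whenever
an intermediate accumulator value goes negative, which is why the
tests check the per-iteration semantics explicitly.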

Report URL: https://github.com/apache/tvm/actions/runs/19662404722

With regards,
GitHub Actions via GitBox


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
