I have a small collection of Relay passes that I found useful and that might be
interesting for more general use:
- Merge consecutive transpose ops (useful for converted models when the source 
framework used different layout conventions; I'm converting from PyTorch),
- merge equal constants (e.g. shape constants) to enable CSE (again mostly for 
converted models, but probably also useful for our own),
- specialize `_like` ops (broadcast, collapse_sum, zeros, ones, reshape) to 
their `_to` (and similar) counterparts when shapes are static. This is useful 
after taking gradients because it can reduce the complexity of the graph 
immensely, and it removes broadcast/collapse_sum when they are not needed,
- remove `*1` and `+0` (with tensor ones and zeros; these appear in gradients),
- spell out LayerNorm in training mode as its constituent ops (batch norm etc. 
could be done, too).
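As a sanity check of the transpose-merge rule above: two consecutive transposes compose into a single permutation, and sometimes cancel entirely. A minimal NumPy sketch (`compose_transposes` is just an illustrative helper name, not an existing API):

```python
import numpy as np

def compose_transposes(axes_inner, axes_outer):
    # transpose(transpose(x, axes_inner), axes_outer)
    # is equivalent to transpose(x, merged) where
    # merged[j] = axes_inner[axes_outer[j]].
    return [axes_inner[i] for i in axes_outer]

x = np.arange(24).reshape(2, 3, 4)
a1, a2 = (1, 2, 0), (2, 0, 1)
merged = compose_transposes(a1, a2)
assert np.array_equal(x.transpose(a1).transpose(a2), x.transpose(merged))
# Here merged == [0, 1, 2], i.e. the identity permutation:
# both transposes can be removed outright.
```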

I implemented these (mostly) using the pattern matching facilities in Python. 
Moving them into TVM itself likely means porting them to C++.

Are these useful, and if so, would we want to include them automatically in the 
optimization pipeline (and how)?

Best regards

Thomas





---
[Visit Topic](https://discuss.tvm.ai/t/what-to-do-with-passes/7147/1) to 
respond.
