I don't think it is implemented per se.
There is a [`BatchNormToInferUnpack` function](https://github.com/apache/incubator-tvm/blob/78d79923756ea9ed4545d2faef7d514a300d3452/src/relay/transforms/simplify_inference.cc#L34), part of the [SimplifyInference pass](https://tvm.apache.org/docs/api/python/relay/transform.html#tvm.relay.transform.SimplifyInference) defined later in that file, which unpacks `batch_norm` into the equivalent scale-and-shift arithmetic for inference. Similarly, dropout is removed by that pass.
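
If it helps, here is a minimal sketch of running that pass and watching `batch_norm` disappear (shapes and variable names are purely illustrative, assuming a reasonably recent TVM):

```python
import tvm
from tvm import relay

# Toy function containing batch_norm; shapes are just illustrative.
data = relay.var("data", shape=(1, 16, 32, 32))
gamma = relay.var("gamma", shape=(16,))
beta = relay.var("beta", shape=(16,))
moving_mean = relay.var("moving_mean", shape=(16,))
moving_var = relay.var("moving_var", shape=(16,))

bn = relay.nn.batch_norm(data, gamma, beta, moving_mean, moving_var)
out = bn[0]  # batch_norm returns (out, new_mean, new_var); take the data output
func = relay.Function(relay.analysis.free_vars(out), out)
mod = tvm.IRModule.from_expr(func)

# SimplifyInference needs type information, so run InferType first.
seq = tvm.transform.Sequential([
    relay.transform.InferType(),
    relay.transform.SimplifyInference(),
])
mod = seq(mod)
print(mod)  # batch_norm is gone, replaced by add/sqrt/divide/multiply
```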

I do not think we currently have a training equivalent (which one would need to position in a way that it can be overridden by people with their own batch norm implementation). I'm using my own pass for the `*Norm` ops and generating the dropout masks externally.
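
For reference, a training-mode expansion would use statistics computed from the batch instead of the moving averages. A rough, hand-written sketch of what such a pass might emit (this is not an existing TVM pass; NCHW layout and shapes are purely illustrative):

```python
import tvm
from tvm import relay

# Hypothetical training-mode batch norm expansion with batch statistics.
data = relay.var("data", shape=(8, 16, 32, 32))
gamma = relay.var("gamma", shape=(16,))
beta = relay.var("beta", shape=(16,))
eps = relay.const(1e-5)

# Per-channel batch statistics over N, H, W; keepdims so they broadcast.
mean = relay.mean(data, axis=[0, 2, 3], keepdims=True)
centered = data - mean
var = relay.mean(centered * centered, axis=[0, 2, 3], keepdims=True)

# Reshape the per-channel parameters to (1, C, 1, 1) for broadcasting.
gamma_r = relay.reshape(gamma, (1, 16, 1, 1))
beta_r = relay.reshape(beta, (1, 16, 1, 1))

out = centered / relay.sqrt(var + eps) * gamma_r + beta_r
func = relay.Function([data, gamma, beta], out)
print(tvm.IRModule.from_expr(func))
```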

Best regards

Thomas




