> So, this is not about activation.

Of course it comes from activation, and it is related to the zero point and scale.
Maybe you can read the whole implementation rather than secondhand messages.

For this min/max activation:
1. The values are even named after activation when used in the compute kernel:
https://github.com/tensorflow/tensorflow/blob/v2.0.0-beta1/tensorflow/lite/kernels/internal/reference/conv.h#L174-L175
2. The min/max is generated at the *prepare* stage of convolution:
https://github.com/tensorflow/tensorflow/blob/v2.0.0-beta1/tensorflow/lite/kernels/conv.cc#L312-L318
3. The function in 2 eventually calls:
https://github.com/tensorflow/tensorflow/blob/v2.0.0-beta1/tensorflow/lite/kernels/kernel_util.cc#L138-L163
4. Min/max are set to the representable value range of the data type ONLY when
no fused activation is found in the operator.

-- 
https://github.com/dmlc/tvm/issues/2351#issuecomment-502508081