
> @jnorwood Have read again the long discussions, I finally understand what you 
> are trying to say. Let me ask this question: considering ReLU6 in float, do 
> you think it is saturating input float values into [0, 6]?


The 0.0..6.0 float clamping is applied during training if relu6 is used as the 
activation.  It may also be used to force the range used to create the 
downscale constants and offsets for inference; that appears to be the case, 
judging from your activation code excerpt.
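
Roughly, that range-to-constants step looks like the usual asymmetric 
quantization parameter choice.  A minimal sketch (function name and rounding 
details are mine, not taken from your code):

```python
def choose_quant_params(rmin, rmax, num_bits=8):
    """Map a float range [rmin, rmax] onto the uint8 codes 0..2**num_bits - 1.

    Returns (scale, zero_point) such that real_value ~= scale * (q - zero_point).
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (rmax - rmin) / (qmax - qmin)
    # zero_point is the uint8 code that represents the real value 0.0
    zero_point = int(round(qmin - rmin / scale))
    zero_point = max(qmin, min(qmax, zero_point))
    return scale, zero_point

# relu6 range seen during training: [0.0, 6.0]
scale, zp = choose_quant_params(0.0, 6.0)
print(scale, zp)  # ~0.0235, 0
```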

The gemmlowp example indicates that they always extend a range if it doesn't 
include 0.  I believe their reason was that an exact zero representation is 
needed in the range... perhaps for padding.  I didn't see that in the 
activation code excerpt, but perhaps that is handled elsewhere.
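
My reading of the gemmlowp range handling, sketched very roughly (just the 
range extension, not the zero-point nudging they also do):

```python
def extend_range_to_include_zero(rmin, rmax):
    """If [rmin, rmax] does not contain 0.0, extend it so that it does.

    That guarantees some uint8 code maps to exactly 0.0, which matters
    e.g. for zero padding in convolutions.
    """
    return min(rmin, 0.0), max(rmax, 0.0)

print(extend_range_to_include_zero(2.0, 6.0))    # (0.0, 6.0)
print(extend_range_to_include_zero(-3.0, -1.0))  # (-3.0, 0.0)
```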

On the quantized inference side, those min and max values are applied after the 
downscale and offset.  It seems more appropriate to recognize that they are 
needed to saturate to the quantized bit width, whether or not an activation 
operation was used in the training model.
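
In other words, the order of operations on the inference side looks roughly 
like this (a sketch with made-up constants, not actual tflite kernel code):

```python
import numpy as np

def requantize(acc, downscale, output_offset, act_min=0, act_max=255):
    """Apply the downscale and offset to an int32 accumulator, then clamp.

    The final clamp to [act_min, act_max] saturates to the 8-bit output
    range whether or not the float model had an activation at this point.
    """
    q = np.round(acc.astype(np.float64) * downscale) + output_offset
    return np.clip(q, act_min, act_max).astype(np.uint8)

acc = np.array([-500, 0, 12000, 90000], dtype=np.int32)
print(requantize(acc, downscale=0.003, output_offset=0))  # [  0   0  36 255]
```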

I've only seen 0 and 255 for those input min and max values in the six 
quantized tflite models I've converted.  I dumped them all to check.

No, there is no saturation being applied to input values during inference.  
The input values are uint8 in the tflite models, and extra info stored in the 
model file indicates the input range and offset.  In some model operations that 
input info is needed for a rescale.  For example, in the multi-input concat 
operations in the inception_v3 model, the input ranges differ, so a rescale is 
required.
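
The concat rescale I have in mind is something like the following sketch; the 
scales and offsets here are invented for illustration:

```python
import numpy as np

def rescale_to(q, in_scale, in_zero, out_scale, out_zero):
    """Re-express uint8 values quantized as (in_scale, in_zero) in the
    output quantization (out_scale, out_zero)."""
    real = in_scale * (q.astype(np.int32) - in_zero)
    q_out = np.round(real / out_scale) + out_zero
    return np.clip(q_out, 0, 255).astype(np.uint8)

# Two concat inputs with different ranges have to be brought to the
# output's scale/offset before they can be joined.
a = np.array([0, 128, 255], dtype=np.uint8)   # quantized with scale 0.02, zero 0
b = np.array([0, 128, 255], dtype=np.uint8)   # quantized with scale 0.05, zero 128
out = np.concatenate([
    rescale_to(a, 0.02, 0, 0.05, 128),
    rescale_to(b, 0.05, 128, 0.05, 128),   # already in the output scale
])
print(out)  # [128 179 230   0 128 255]
```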

The TF training models associated with the quantized tflite models have 
activation and bn operations that are effectively fused together with the 
conv, along with the fake quantization ops.  No separate activation nodes 
appear in the associated inference models.
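
For what it's worth, folding bn into the conv is the standard trick of scaling 
the weights and bias per output channel; a rough sketch (shapes and epsilon 
are assumptions on my part):

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold batch-norm parameters into the preceding conv's weights and bias.

    w has output channels on its last axis; b is the per-channel conv bias.
    After folding, the inference graph needs no separate bn node.
    """
    factor = gamma / np.sqrt(var + eps)   # per output channel
    w_folded = w * factor                 # scales each output channel
    b_folded = (b - mean) * factor + beta
    return w_folded, b_folded
```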
