> > Not true. When there is an activation, the range is not always 0 ~ 255. For 
> > example, ReLU.
> 
> I believe tflite extends the quantization range so it always includes 0, as 
> done in the gemmlowp quantization example below. I have dumped my min and max 
> saturation input values from the six quantized tflite models (two mobilenets 
> and four inceptions). They are all 0 and 255.
> 
> `https://github.com/google/gemmlowp/blob/master/doc/quantization_example.cc`
> 
> ```
> // Given the min and max values of a float array, return
> // reasonable quantization parameters to use for this array.
> QuantizationParams ChooseQuantizationParams(float min, float max) {
>   // We extend the [min, max] interval to ensure that it contains 0.
>   // Otherwise, we would not meet the requirement that 0 be an exactly
>   // representable value.
>   min = std::min(min, 0.f);
>   max = std::max(max, 0.f);
> ```

I think you may not have fully understood my previous comment. One question I 
want to ask: do your quantized models contain conv + relu / relu6 like our model 
does? If not, the range is obviously 0 ~ 255, no matter how many models you dump. 
Please see: 
https://github.com/tensorflow/tensorflow/blob/v2.0.0-beta1/tensorflow/lite/kernels/kernel_util.cc#L138
@jackwish and I have emphasized this function many times.
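To make the point concrete, here is a minimal, self-contained C++ sketch of what that function does when an activation is fused into the op. This is not the actual TFLite source; the enum, the function name, and the quantization parameters in `main` are made up for illustration. The idea is that the clamping bounds come from quantizing the real values 0 and 6 with the output tensor's scale and zero point, so they are 0 and 255 only for particular output ranges.

```
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>

// Which activation was fused into the conv (names are illustrative).
enum class FusedActivation { kNone, kRelu, kRelu6 };

// Compute the uint8 clamping range for a fused activation, in the spirit of
// CalculateActivationRangeUint8 in tflite's kernel_util.cc.
void ActivationRangeUint8(FusedActivation activation, float output_scale,
                          int32_t output_zero_point, int32_t* act_min,
                          int32_t* act_max) {
  const int32_t qmin = 0;    // uint8 min
  const int32_t qmax = 255;  // uint8 max

  // Quantize a real value with the output tensor's quantization parameters.
  auto quantize = [=](float f) {
    return output_zero_point + static_cast<int32_t>(std::round(f / output_scale));
  };

  switch (activation) {
    case FusedActivation::kRelu:
      *act_min = std::max(qmin, quantize(0.0f));  // clamp below at real 0
      *act_max = qmax;
      break;
    case FusedActivation::kRelu6:
      *act_min = std::max(qmin, quantize(0.0f));  // clamp below at real 0
      *act_max = std::min(qmax, quantize(6.0f));  // clamp above at real 6
      break;
    default:  // no fused activation: full uint8 range
      *act_min = qmin;
      *act_max = qmax;
      break;
  }
}

int main() {
  // Hypothetical output quantization covering roughly [-1, 7]:
  // scale = 8 / 255, zero_point chosen so that real 0 maps to 32.
  const float scale = 8.0f / 255.0f;
  const int32_t zero_point = 32;

  int32_t act_min = 0, act_max = 0;
  ActivationRangeUint8(FusedActivation::kRelu6, scale, zero_point,
                       &act_min, &act_max);
  // Prints act_min = 32, act_max = 223 -- not 0 ~ 255.
  std::cout << "act_min = " << act_min << ", act_max = " << act_max << "\n";
}
```

With these made-up output parameters the clamp is [32, 223] rather than [0, 255]; the actual values depend entirely on the scale and zero point the converter records for the output tensor.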

Please construct a quantized model like ours:
![image](https://user-images.githubusercontent.com/7287321/59581062-36660e00-9106-11e9-93c1-2953571766f8.png)

I am sure you will observe a different result.
