Hi, 

Does QNN support int8 --> uint8 or uint8 --> int8 conversion for pre-quantized models? If not, is there a plan to support it?
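For context, here is a minimal sketch (plain NumPy, not the QNN API) of the kind of conversion I mean: since an affine-quantized value dequantizes as `r = scale * (q - zero_point)`, switching uint8 to int8 is just a shift of 128 applied to both the stored values and the zero point, with the scale unchanged.

```python
import numpy as np

def uint8_to_int8(q_u8, zero_point_u8):
    """Convert uint8 quantized values/zero point to the equivalent int8
    representation. The scale is unchanged; only a shift of 128 is applied."""
    q_i8 = q_u8.astype(np.int16) - 128   # shift stored values into int8 range
    zp_i8 = zero_point_u8 - 128          # shift the zero point the same way
    return q_i8.astype(np.int8), zp_i8

scale = 0.05
zp_u8 = 128
q_u8 = np.array([0, 128, 255], dtype=np.uint8)
q_i8, zp_i8 = uint8_to_int8(q_u8, zp_u8)

# Dequantized real values are identical under either representation.
r_u8 = scale * (q_u8.astype(np.int32) - zp_u8)
r_i8 = scale * (q_i8.astype(np.int32) - zp_i8)
assert np.allclose(r_u8, r_i8)
```

The function name and shapes here are hypothetical; the question is whether QNN can apply this rewrite automatically to a pre-quantized graph.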

Tagging @anijain2305 because you are fantastic! Thank you!


---
[Visit Topic](https://discuss.tvm.apache.org/t/support-for-pre-quantized-model-int8-uint8-conversion/8064/1) to respond.