@jackwish If ReLU activations are used, there is no need to spend half of the representation space on negative values; hence the extra bit of precision.
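
To make the point concrete, here is a minimal sketch (not from the original comment) comparing quantization step sizes for a non-negative, post-ReLU activation range; the [0, 6] range is just an illustrative ReLU6-style assumption:

```python
import numpy as np

# Assumed example: quantizing a post-ReLU activation range of [0, 6.0].
act_max = 6.0

# Symmetric signed int8 uses codes [-127, 127], but after ReLU the negative
# half of the code space is never used.
scale_int8 = act_max / 127.0

# Unsigned uint8 uses codes [0, 255], so all codes cover the actual range.
scale_uint8 = act_max / 255.0

print(f"int8  step size: {scale_int8:.5f}")   # ~0.04724
print(f"uint8 step size: {scale_uint8:.5f}")  # ~0.02353

# The unsigned step size is roughly half of the signed one, i.e. about one
# extra bit of precision for the same 8-bit storage.
```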