This makes sense.

Best Regards
Zhenhua


eqy <notificati...@github.com> wrote on Sun, Jun 2, 2019 at 9:29 AM:

> @jackwish <https://github.com/jackwish>
> If relu activations are used, there is no need to use half of the
> representation space for negative values; thus the extra bit of precision.
>
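
To make the quoted point concrete, here is a minimal NumPy sketch (not TVM code; the quantize helper below is just for illustration): because a relu output is non-negative, mapping it onto the full unsigned 8-bit range roughly halves the quantization step compared with using only the non-negative half of a signed 8-bit range.

import numpy as np

def quantize(x, qmin, qmax):
    # Affine-quantize a non-negative tensor onto the integer range [qmin, qmax].
    scale = x.max() / (qmax - qmin) if x.max() > 0 else 1.0
    q = np.clip(np.round(x / scale) + qmin, qmin, qmax).astype(np.int32)
    return q, scale

activations = np.maximum(np.random.randn(1000).astype(np.float32), 0.0)  # relu output, all >= 0

# Signed int8 with zero point 0: only codes 0..127 are ever used for relu outputs.
q_s8, scale_s8 = quantize(activations, 0, 127)
# Unsigned int8: all 256 codes cover the same [0, max] interval.
q_u8, scale_u8 = quantize(activations, 0, 255)

print("int8  step size:", scale_s8)
print("uint8 step size:", scale_u8)  # roughly half the int8 step, i.e. one extra bit of precision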

