https://gcc.gnu.org/bugzilla/show_bug.cgi?id=61414

--- Comment #12 from Jonathan Wakely <redi at gcc dot gnu.org> ---
(In reply to eric-bugs from comment #9)
> This does not seem like correct behavior to me either. The warning should be
> based on the maximum declared enum value, not the maximum possible value
> held by the underlying type.

No. The range of valid values of an enumeration is not limited to its declared enumerators, so a warning based only on the maximum declared value would be wrong.

> After all as of C++17, the standard makes it undefined what happens if you
> try to stuff an integer into an enum value that doesn't correspond to one of
> the values listed in the enum declaration.

What? No it doesn't. For an enumeration without a fixed underlying type, converting an integer is undefined only when the value is outside the range of the enumeration's values, i.e. the range representable in the smallest bit-field that can hold all the enumerators, not when it merely fails to match one of the listed enumerators.