https://gcc.gnu.org/bugzilla/show_bug.cgi?id=110890

            Bug ID: 110890
           Summary: std::is_array and std::extent incorrectly choose a
                    partial specialization when the size of an array
                    exceeds INT32_MAX
           Product: gcc
           Version: 9.4.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c++
          Assignee: unassigned at gcc dot gnu.org
          Reporter: y1079700998 at gmail dot com
  Target Milestone: ---

#include <type_traits>
#include <cstdint>
#include <iostream>

using T1 = char[1ul + INT32_MAX];

int main() {
    std::cout << std::is_array_v<T1> << '\n';   // expected: 1, GCC prints 0
    std::cout << std::extent_v<T1, 0> << '\n';  // expected: 2147483648, GCC prints 0
}

When compiled with `g++ array_test.cpp -g -std=c++17 -Wall -Wextra
-fno-strict-aliasing -fwrapv -fno-aggressive-loop-optimizations
-fsanitize=undefined`, this program prints "0" and "0". This behavior was
also observed with GCC 13.2 on godbolt.

When compiled with clang++ (tested with 10.0.0), however, the program prints
the expected result of "1" and "2147483648".
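
For context, the partial-specialization pattern that the summary refers to can
be reproduced with a minimal hand-rolled trait. This is an illustrative sketch
only (the names my_is_array and my_extent are made up for this example, and it
only handles the first array dimension); it is not libstdc++'s actual
implementation. The T[N] specialization is declared with a std::size_t bound,
so it ought to be selected even when the bound exceeds INT32_MAX:

#include <cstddef>
#include <cstdint>
#include <iostream>
#include <type_traits>

// Primary template: T is not an array.
template <class T>
struct my_is_array : std::false_type {};

// Partial specialization: matches T[N] for any std::size_t bound N.
template <class T, std::size_t N>
struct my_is_array<T[N]> : std::true_type {};

// Same pattern for a simplified, first-dimension-only extent trait.
template <class T>
struct my_extent : std::integral_constant<std::size_t, 0> {};

template <class T, std::size_t N>
struct my_extent<T[N]> : std::integral_constant<std::size_t, N> {};

int main() {
    using T1 = char[1ul + INT32_MAX];
    // Both partial specializations should be chosen here, since the
    // bound 2147483648 fits comfortably in std::size_t.
    std::cout << my_is_array<T1>::value << '\n';  // expected: 1
    std::cout << my_extent<T1>::value << '\n';    // expected: 2147483648
}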

I discovered this behavior while performing some edge-case testing. It's
probably very rare to have an array of such a size in a real application.
