https://gcc.gnu.org/bugzilla/show_bug.cgi?id=85439
--- Comment #1 from Marc "Foddex" Oude Kotte <foddex at foddex dot net> ---
When one adds std::hex output formatting to the first four std::cout calls,
the first line, for example, becomes:

  std::cout << "32 bits gen, uint32_t: " << std::hex << generateRandom<std::mt19937, std::uniform_int_distribution, uint32_t>() << std::endl;

The output on MSVC and OSX becomes:

  32 bits gen, uint32_t: d091bb5c
  32 bits gen, uint64_t: d091bb5c22ae9ef6
  64 bits gen, uint32_t: f6f6aea6
  64 bits gen, uint64_t: c96d191cf6f6aea6

And the output on Linux becomes:

  32 bits gen, uint32_t: d091bb5c
  32 bits gen, uint64_t: d091bb5c22ae9ef6
  64 bits gen, uint32_t: c96d191d
  64 bits gen, uint64_t: c96d191cf6f6aea6

A number of observations:
- on output line 3, MSVC and OSX appear to return the lower 32 bits of output line 4
- on output line 3, Linux appears to return the upper 32 bits of output line 4, plus 1

But:
- on output line 1, every implementation returns the upper 32 bits of output line 2

I am at a loss as to who is right here.
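
For reference, a minimal self-contained sketch of the test described above. The
exact generateRandom harness is defined in the original report and is not shown
here, so its body below is an assumption (default-seeded engine, default
full-range distribution); only the four std::cout calls with std::hex match
what this comment describes.

  // Assumed reconstruction of the test harness; the real generateRandom
  // is in the original bug report and may differ in details such as seeding.
  #include <cstdint>
  #include <iostream>
  #include <random>

  template <typename Generator, template <typename> class Distribution, typename ResultType>
  ResultType generateRandom()
  {
      Generator gen;                   // default-seeded engine (assumption)
      Distribution<ResultType> dist;   // full-range distribution (assumption)
      return dist(gen);
  }

  int main()
  {
      std::cout << "32 bits gen, uint32_t: " << std::hex
                << generateRandom<std::mt19937, std::uniform_int_distribution, uint32_t>() << std::endl;
      std::cout << "32 bits gen, uint64_t: " << std::hex
                << generateRandom<std::mt19937, std::uniform_int_distribution, uint64_t>() << std::endl;
      std::cout << "64 bits gen, uint32_t: " << std::hex
                << generateRandom<std::mt19937_64, std::uniform_int_distribution, uint32_t>() << std::endl;
      std::cout << "64 bits gen, uint64_t: " << std::hex
                << generateRandom<std::mt19937_64, std::uniform_int_distribution, uint64_t>() << std::endl;
      return 0;
  }

The interesting cases are the two where the engine's word size differs from the
requested result type, since that is where the implementations diverge.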