Trying to find out why the to_unicode tests of libidn2 have been failing
for a few months...

It happens on the OS X Travis-CI runner; all the info I have is:

$ clang --version
Apple LLVM version 8.1.0 (clang-802.0.42)
Target: x86_64-apple-darwin16.7.0
Thread model: posix
InstalledDir:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin

locale_charset() returns with "UTF-8".

u8_strconv_to_locale() and u8_strconv_from_locale() do not seem to work
as expected:


One problem seems to be that u8_strconv_to_locale() outputs decomposed
characters, e.g. u8_strconv_to_locale("bücher.de") returns b"ucher.de.

Hex/u32:

Result:   U+0062 U+0022 U+0075 U+0063 U+0068 U+0065 U+0072 U+002e U+0064 U+0065
Expected: U+0062 U+00fc U+0063 U+0068 U+0065 U+0072 U+002e U+0064 U+0065



The second problem is that characters beyond U+00FF are replaced with
'?' (U+003F).


Do you have any hints on how to fix these problems?
I would expect u8_strconv_to_locale() to behave in a well-defined manner
in UTF-8 locales - but maybe I am wrong. I could apply a normalization
step in the test itself, but I am not sure that is the correct solution.
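In case the normalization route turns out to be the right one:
libunistring's uninorm.h provides u8_normalize(), so the test could
re-compose the converted string to NFC before comparing. A hedged sketch
(assuming the test already links against libunistring; the helper name
to_nfc() is mine, and error handling is omitted):

```c
#include <stdint.h>
#include <string.h>
#include <uninorm.h>   /* u8_normalize, UNINORM_NFC (libunistring) */

/* Re-compose a NUL-terminated UTF-8 string to NFC.
   Returns a malloc'd buffer (or NULL on error); the caller frees it. */
static uint8_t *
to_nfc (const uint8_t *s)
{
  size_t len = strlen ((const char *) s) + 1;  /* include the NUL */
  size_t outlen;
  return u8_normalize (UNINORM_NFC, s, len, NULL, &outlen);
}
```

Running both the expected and the converted string through to_nfc()
before strcmp() would make the comparison immune to the NFC/NFD
difference - though it papers over the decomposition rather than
explaining it, and it does nothing for problem 2.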

For problem 2 I see no solution right now.


With Best Regards, Tim

