Alexander Gnauck <[EMAIL PROTECTED]> writes:

> Hello,

Hi Alexander!

> I'm currently working on autogeneration tools for the tables in C#
> for the C# port of libIDN.

Thanks!

> The C# port currently has the same limitations as the Java code.
> Simon asked me if this is something that could be fixed, but I don't
> think so.  C#'s char type is 16 bits wide, with the range U+0000 - U+FFFF.
> see:
> http://msdn.microsoft.com/library/default.asp?url=/library/en-us/csref/html/vclrfChar_PG.asp

I see.  This is the same problem that earlier versions of Java had.
Perhaps they will fix it in C# too.
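To make the limitation concrete: code points above U+FFFF cannot fit in
a single 16-bit char and must be represented in UTF-16 as a surrogate
pair.  A minimal C sketch of that encoding (an illustration only, not
code from libidn or its C# port):

```c
#include <assert.h>
#include <stdint.h>

/* Encode a supplementary code point (U+10000..U+10FFFF) as a
   UTF-16 surrogate pair.  Illustration only; hypothetical helper,
   not part of the libidn API. */
static void
to_surrogate_pair (uint32_t cp, uint16_t *high, uint16_t *low)
{
  assert (cp >= 0x10000 && cp <= 0x10FFFF);
  cp -= 0x10000;
  *high = (uint16_t) (0xD800 + (cp >> 10));   /* high (lead) surrogate */
  *low  = (uint16_t) (0xDC00 + (cp & 0x3FF)); /* low (trail) surrogate */
}
```

For instance, U+10205 from the test case below becomes the pair
0xD800, 0xDE05 — two separate 16-bit chars, which is why per-char
table lookups miss such code points.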

> Are there any tests cases with unicode characters > U+FFFF?

For example:

IDNA(<U+10205><U+00ed>dn.example) = xn--dn-mja7734x.example

See also RFC 3454, especially table D.2.

> How is this handled in the C version?

I use uint32_t as the Unicode code point data type; it can store 32
bits, which is more than enough for the full Unicode range (up to
U+10FFFF).
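Since each code point is a single uint32_t, no surrogate handling is
needed once the input is decoded.  A minimal sketch of combining a
UTF-16 surrogate pair back into a 32-bit code point (the real libidn
functions convert whole strings, e.g. UTF-8 to uint32_t arrays; this
hypothetical helper just shows the arithmetic):

```c
#include <stdint.h>

/* Combine a UTF-16 surrogate pair into one 32-bit code point.
   Sketch only; assumes 'high' is in 0xD800..0xDBFF and 'low'
   in 0xDC00..0xDFFF. */
static uint32_t
from_surrogate_pair (uint16_t high, uint16_t low)
{
  return 0x10000
         + (((uint32_t) (high - 0xD800) << 10)
            | (uint32_t) (low - 0xDC00));
}
```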

Good luck,
Simon


_______________________________________________
Help-libidn mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/help-libidn
