Robert Bradshaw, 10.08.2012 22:07:
> On Fri, Aug 10, 2012 at 12:31 PM, Robert Bradshaw wrote:
>> On Fri, Aug 10, 2012 at 12:10 PM, Stefan Behnel wrote:
>>> Robert Bradshaw, 10.08.2012 20:54:
>>>> OK, the problem boiled down to using 'b' or 'B' for the format string
>>>> for char rather than 'c'. See, e.g.
>>>> http://docs.python.org/library/array.html . I've pushed a fix at
>>>> https://github.com/robertwb/cython/commit/b0539cbc32c200a09b1fbddf2d6943e92aec2f3e
>>>> , but it'd be good to have another set of eyes with someone more
>>>> familiar with this code to be sure I didn't miss anything. It also
>>>> involved a lot of changes to the tests
>>>> https://github.com/robertwb/cython/commit/6bcc8fd8419e6e4079344788023029736610d5aa
>>>> so is not exactly backwards compatible, though perhaps more correct.
>>>
>>> That's unfortunate, but since it's a bug, it needs fixing.
>>
>> Actually, this fix breaks numpy_test, and I'm not sure how to fix it
>> given that numpy has no notion of a "possibly signed char" (unless we
>> disallow using char/char* with numpy altogether, which is much more
>> extreme, or don't enforce that signs match for char, which is likely
>> more invasive). Ugh.

Many people only run their code on one particular platform (usually some
x86 flavour), often just on their own machine. Disallowing it altogether
sounds overly extreme in that light.


> I suppose the lack of numpy support for (no-signed) char does explain
> why we were in the state we were in. I've pushed another commit that
> lets unsigned char == char == signed char for the purposes of buffer
> comparison (though not transitively, so unsigned char != signed char).

I find that behaviour acceptable. The signedness of "char" is
implementation-defined in C, so the only really safe value range is 0..127.
We can leave it to users to get it right if they really want to use it.
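
To illustrate why only 0..127 is safe, here is a minimal Python sketch
(using the struct module's format codes, which match the buffer codes
discussed above: 'b' is signed char, 'B' is unsigned char, 'c' is a raw
one-byte value) showing how the same byte diverges once you leave that
range:

```python
from struct import pack, unpack

# Bytes 0..127 round-trip identically whether read as signed or
# unsigned char; bytes >= 128 do not.
raw = pack("B", 200)               # the single byte 0xC8

as_unsigned, = unpack("B", raw)    # read as unsigned char -> 200
as_signed, = unpack("b", raw)      # read as signed char   -> 200 - 256 = -56

print(as_unsigned, as_signed)
```

A platform where plain char is unsigned sees the first value, one where it
is signed sees the second, which is exactly the ambiguity the relaxed
buffer comparison papers over.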

Stefan

_______________________________________________
cython-devel mailing list
cython-devel@python.org
http://mail.python.org/mailman/listinfo/cython-devel