[Cython] Upcoming issues with NumPy deprecated APIs and Cython's sizeof checks

2012-01-30 Thread Lisandro Dalcin
I'm testing my code with numpy-dev. They are trying to discourage use
of deprecated APIs, which includes direct access to the ndarray struct.
In order to update your code, you have to pass -DNPY_NO_DEPRECATED_API
to the C compiler (or #define it before including the NumPy headers).

However, they have implemented this feature by exposing the ndarray
type with just the Python object header:
https://github.com/numpy/numpy/blob/master/numpy/core/include/numpy/ndarraytypes.h#L695

Obviously, this interacts badly with Cython's sizeof check; I'm getting
this runtime warning:

build/lib.linux-x86_64-2.7/petsc4py/lib/__init__.py:64:
RuntimeWarning: numpy.ndarray size changed, may indicate binary
incompatibility
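
For reference, any extension module that cimports numpy is enough to
trigger the check at import time; a minimal illustrative sketch (not the
actual petsc4py code):

    # Cython's generated import code compares the compile-time
    # sizeof(PyArrayObject) from the NumPy headers against the runtime
    # ndarray type's tp_basicsize, which is what produces the warning.
    cimport numpy as cnp

    cnp.import_array()  # standard NumPy C-API initialisation

    def first_element(cnp.ndarray a):
        # Merely using the ndarray extension type is enough; the size
        # check happens in the module import code, not in this function.
        return a[0]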

I think there is nothing Cython can do about this (other than
special-casing NumPy to disable this VERY useful warning).

I've tried the patch below with success, but I'm not convinced...
Do any of you have a suggestion for the NumPy folks about how to
improve this?


diff --git a/numpy/core/include/numpy/ndarraytypes.h b/numpy/core/include/numpy/ndarraytypes.h
index 0288272..1fcbf52 100644
--- a/numpy/core/include/numpy/ndarraytypes.h
+++ b/numpy/core/include/numpy/ndarraytypes.h
@@ -695,6 +695,7 @@ typedef struct tagPyArrayObject_fields {
 #ifdef NPY_NO_DEPRECATED_API
 typedef struct tagPyArrayObject {
 PyObject_HEAD
+char _npy_array_fields[sizeof(PyArrayObject_fields)-sizeof(PyObject)];
 } PyArrayObject;
 #else
 /*

-- 
Lisandro Dalcin
---
CIMEC (INTEC/CONICET-UNL)
Predio CONICET-Santa Fe
Colectora RN 168 Km 472, Paraje El Pozo
3000 Santa Fe, Argentina
Tel: +54-342-4511594 (ext 1011)
Tel/Fax: +54-342-4511169


Re: [Cython] [cython-users] Re: How to find out where an AttributeError is ignored

2012-01-30 Thread Robert Bradshaw
On Fri, Jan 27, 2012 at 1:01 PM, Stefan Behnel  wrote:
> Dag Sverre Seljebotn, 27.01.2012 21:03:
>> On 01/27/2012 05:58 PM, Stefan Behnel wrote:
>>> mark florisson, 27.01.2012 17:30:
>>>> On 27 January 2012 16:22, mark florisson  wrote:
>>>>> On 27 January 2012 15:47, Simon King  wrote:
>>>>>> Hi all,
>>>>>>
>>>>>> I am still *very* frustrated about the fact that Cython does not tell
>>>>>> where the error occurs. For about a week, I have been adding lots and
>>>>>> lots of lines to Sage that write a log to some file, so that I get
>>>>>> at least some idea of where the error occurs. But still: even these
>>>>>> extensive logs do not provide a hint on what exactly is happening.
>>>>>>
>>>>>> How can I patch Cython such that some more information on the location
>>>>>> of the error is printed? I unpacked Sage's Cython spkg and did
>>>>>> "grep -R ignored .", but the code lines containing the word "ignored"
>>>>>> did not seem to be the lines that are responsible for printing the
>>>>>> warning message
>>>>>>    Exception AttributeError: 'PolynomialRing_field_with_category'
>>>>>> object has no attribute '_modulus' in  ignored
>>>>>>
>>>>>> Can you point me to the file in Sage's Cython spkg which is
>>>>>> responsible for printing the warning?
>>>>>>
>>>>>> Best regards,
>>>>>> Simon
>>>>>
>>>>> These messages are written by PyErr_WriteUnraisable, which is a
>>>>> CPython C API function that writes unraisable exceptions. There are
>>>>> typically two reasons for unraisable exceptions:
>>>>>
>>>>>     1) as Robert mentioned, a function that does not allow propagation
>>>>> of exceptions, e.g.
>>>>>
>>>>>         cdef int func():
>>>>>             raise Exception
>>>>>
>>>>>         Here there is no way to propagate the raised exception, so
>>>>> instead one should write something like
>>>>>
>>>>>             cdef int func() except -1: ...
>>>>>
>>>>>         Alternatively one may use 'except *' in case there is no error
>>>>> indicator and Cython should always check, or "except? -1", which means
>>>>> "-1 may or may not indicate an error".
>>>>>
>>>>>     2) in deallocators or finalizers (e.g. __dealloc__ or __del__)
>>>>>
>>>>> For functions the right thing is to add an except clause; for
>>>>> finalizers and destructors one could use the traceback module, e.g.
>>>>>
>>>>>     try:
>>>>>         ...
>>>>>     except:
>>>>>         traceback.print_exc()
>>>>>
>>>>> If this all still doesn't help, try setting a (deferred) breakpoint on
>>>>> __Pyx_WriteUnraisable or PyErr_WriteUnraisable.
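
To put the quoted advice in one place, here is a self-contained sketch
of the declarations described above (the function names are just
illustrative):

    # No except clause: a raised exception cannot propagate to the
    # caller, so CPython prints it via PyErr_WriteUnraisable
    # ("... ignored" messages).
    cdef int silent(int x):
        if x < 0:
            raise ValueError("negative")
        return x

    # "except -1": returning -1 always signals an error, so callers
    # re-raise it properly.
    cdef int checked(int x) except -1:
        if x < 0:
            raise ValueError("negative")
        return x

    # "except *": no spare return value, so callers always check
    # PyErr_Occurred() after the call.
    cdef void checked_void(int x) except *:
        if x < 0:
            raise ValueError("negative")

    # "except? -1": -1 *may* signal an error; callers check
    # PyErr_Occurred() only when they actually see -1.
    cdef int maybe(int x) except? -1:
        if x < 0:
            raise ValueError("negative")
        return x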

>>>> Actually, I don't see why the default is to write unraisable
>>>> exceptions. Instead Cython could detect that exceptions may propagate
>>>> and have callers do the check (i.e. make it implicitly "except *").
>>
>> As for speed, there are optimizations on this, e.g., "except? 32434623" if
>> the return type is int, "except? 0xf..." if the return type is a pointer.
>>
>> And for floating point, we could make our own NaN -- that's obscure enough
>> that it could probably be made "except cython.cython_exception_nan" by
>> default, not "except? cython.cython_exception_nan".
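
For reference, a minimal sketch of the existing form of the mechanism
under discussion (the sentinel value is arbitrary, and
cython.cython_exception_nan above is only a proposed name, not an
existing API):

    # "except? VALUE" declares VALUE as a *possible* error indicator:
    # callers generated by Cython call PyErr_Occurred() only when they
    # actually see VALUE come back.
    cdef int parse_positive(s) except? 32434623:
        cdef int value = int(s)
        if value <= 0:
            raise ValueError("expected a positive integer")
        return value

    def check(s):
        # The common (non-error) path stays cheap; only a returned
        # 32434623 triggers the extra exception check in the caller.
        return parse_positive(s)
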
>
> The problem with that is that we can't be sure that Cython will be the only
> caller. So exceptions may still not propagate in some cases, and users will have
> to know about these "obscure" values and that they must deal with them
> manually then.
>
> You could add that we'd just have to disable this when user code takes a
> pointer from a function, but then, how many rules are there that users will
> have to learn and remember after such a change? And what's that for a
> language that changes the calling semantics of a function because way down
> in the code someone happens to take a pointer to it?
>
>
>>>> Was this not implemented because Cython only knows whether functions
>>>> may propagate exceptions at code generation time by looking at the
>>>> presence of an error label?
>>>> Maybe it could keep code insertion points around for every call to
>>>> such a potential function and, if the function uses the error label,
>>>> have the caller perform the check? Although I do foresee problems for
>>>> such external functions... maybe Cython could have its own
>>>> threadstate, regardless of the GIL, which would indicate whether an
>>>> error has occurred? e.g. CyErr_Occurred()?
>>>
>>> Yep, those are the kind of reasons why writing unraisable exceptions is the
>>> default.
>>
>> Still,
>
> I wasn't really advocating this behaviour, just indicating that it's hard
> to do "better", because this "better" isn't all that clear. It's also not
> "better" for all code, which means that we get from one trade-off to
> another, while breaking existing code at the same time. Not exactly
> paradise on either side of the tunnel.

I still feel like we're stuck in the wrong default. I'd rather require
more work to interact with C libraries than require more work to
convert innocent-looking Python to Cython.

> One example that keeps popping up in my mind is callback fu

Re: [Cython] 0.16 release

2012-01-30 Thread Robert Bradshaw
On Sat, Jan 28, 2012 at 8:05 AM, Vitja Makarov  wrote:
> 2012/1/26 Jason Grout :
>> On 1/25/12 11:39 AM, Robert Bradshaw wrote:
>>>
>>> install
>>>
>>> https://sage.math.washington.edu:8091/hudson/view/ext-libs/job/sage-build/lastSuccessfulBuild/artifact/cython-devel.spkg
>>> by downloading it and running "sage -i cython-devel.spkg"
>>
>>
>>
>> In fact, you could just do
>>
>> sage -i
>> https://sage.math.washington.edu:8091/hudson/view/ext-libs/job/sage-build/lastSuccessfulBuild/artifact/cython-devel.spkg
>>
>> and Sage will (at least, should) download it for you, so that's even one
>> less step!
>>
>> Jason
>>
>
> Thanks for the detailed instructions! I've successfully built it.
>
> "sage -t -gdb ./" doesn't work; is that a bug?
>
> vitja@mchome:~/Downloads/sage-4.8$ ./sage  -t -gdb
> devel/sage/sage/combinat/sf/macdonald.py
> sage -t -gdb "devel/sage/sage/combinat/sf/macdonald.py"
> 
> Type r at the (gdb) prompt to run the doctests.
> Type bt if there is a crash to see a traceback.
> 
> gdb --args python /home/vitja/.sage//tmp/macdonald_6182.py
> starting cmd gdb --args python /home/vitja/.sage//tmp/macdonald_6182.py
> ImportError: No module named site
>         [0.2 s]
>
> --
> The following tests failed:
>
>
>        sage -t -gdb "devel/sage/sage/combinat/sf/macdonald.py"
> Total time for all tests: 0.2 seconds

Yes, that's a bug.

> I've found another way to run tests (using sage -sh and then running
> python directly on ~/.sage/tmp/...py)
>
> So I found one of the problems. Here is a minimal Cython example:
>
> def foo(values):
>    return (0,)*len(values)
> foo([1,2,3])
>
> len(values) somehow is passed as an integer to PyObject_Multiply()

Yeah, that's a bug too :).
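
Until that's fixed, one possible workaround (an untested sketch; the
assumption is that binding the length to a name first avoids the bad
code path):

    def foo(values):
        # Assumption: assigning len() to an untyped variable forces the
        # usual object coercion, so the repetition count reaches the
        # multiply call as a Python object rather than a bare C integer.
        n = len(values)
        return (0,) * n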