[Cython] documentation tarball
Would it be too much trouble to post a snapshot of the cython-docs tree along with a cython release? I see the git repository for cython-docs, but it would be more convenient for an end-user who wants a copy of the docs to not have to search through the commit log to find the right commit corresponding to a release. For example, it would be great if a zip or tarball of the documentation for 0.14.1 was posted on the cython.org homepage in the Download section, perhaps with text like this: The latest release of Cython is 0.14.1 (released 2011-02-04). You can download it as a gzipped tar or as a zip file. The documentation is also available (gzipped tar or zip). Thanks, Jason ___ cython-devel mailing list cython-devel@python.org http://mail.python.org/mailman/listinfo/cython-devel
Re: [Cython] Fwd: Re: Cython builds on various Debian platforms
On Wed, 16 Feb 2011, Lisandro Dalcin wrote:
> > | AssertionError
> > `---
> > what could be done about it or should it be excluded?
> I've pushed some fixes. Now this testcase should run from ancient
> Python 2.3 to head Python 3.2, both for static and sharedlib builds
> (but not in Windows).

First of all, THANKS for the patch -- I picked it up into the 0.14.1-2 Debian package. Tests are now enabled, and I just uploaded 0.14.1-2 into Debian -- let's see how it goes across architectures ;-)

2nd -- THANKS for ...: at first I was confused about why I saw no commits since Dec in the hg clone of Cython I had, and then noticed that you had moved to Git and GitHub. Awesome, and thank you for taking care of my sanity (although unintentionally, I guess ;) )

-- =--= Keep in touch www.onerussian.com Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic
[Cython] Outdated `hg export` on cython-devel homepage
The `hg export` on the cython-devel page [1] should probably be changed to (or additionally list) `git format-patch`, now that Cython's versioned in Git. Keeping the `hg export` reference might be useful for Mercurial lovers using hg-git.

[1]: http://mail.python.org/mailman/listinfo/cython-devel

-- This email may be signed or encrypted with GPG (http://www.gnupg.org). The GPG signature (if present) will be attached as 'signature.asc'. For more information, see http://en.wikipedia.org/wiki/Pretty_Good_Privacy My public key is at http://www.physics.drexel.edu/~wking/pubkey.txt
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
This thread is coming over to cython-dev (and the new cython-devel) from cython-users because it turns out it will probably require changing the Cython code. To get everyone who hasn't been following on cython-users up to speed, here's a summary of what I'm trying to do. That's what I was trying to give with this:

On Wed, Feb 09, 2011 at 12:23:25PM -0500, W. Trevor King wrote:
> I'm wrapping an external C library with Cython, so I have `mylib.pxd`:
>
> cdef extern from 'mylib.h':
>     enum: CONST_A
>     enum: CONST_B
>     ...
>
> where I declare each constant macro from the library's header `mylib.h`:
>
> #define CONST_A 1
> #define CONST_B 2
> ...
>
> Now I want to expose those constants in Python, so I have `expose.pyx`:
>
> cimport mylib
>
> CONST_A = mylib.CONST_A
> CONST_B = mylib.CONST_B
> ...
>
> But the last part seems pretty silly. I'd like to do something like
>
> cimport mylib
> import sys
>
> for name in dir(mylib):
>     setattr(sys.modules[__name__], name, getattr(mylib, name))
>
> which compiles fine, but fails to import with...

Looking into the Cython internals, everything defined in mylib.pxd is stored as `Entry`s in a `ModuleScope`, and...

On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote:
> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King wrote:
> > What I'm missing is a way to bind the ModuleScope namespace to a name
> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib,
> > name)` will work in expose.pyx.
>
> You have also hit into the thorny issue that .pxd files are used for
> many things. They may be pure C library declarations with no Python
> module backing, they may be declarations of (externally implemented)
> Python modules (such as numpy.pxd), or they may be declarations for
> Cython-implemented modules.

> > It seems like it would be easier to generate some kind of wrapper
> > class (PxdModule?)
for mylib when it is cimported (at compile time),
> > and then further interactions would take care of themselves (at run
> > time).
>
> Would such an object be created anew for every module that cimports
> the declaration file?

Hmm, that doesn't sound very nice, does it. However, .pxd files declaring C libraries have no Python-space presence, so that was my initial idea.

> I have toyed with the idea of subclassing the module object itself for
> better support of C-level attributes from the Python (and Cython)
> namespaces.

Sorry, I don't understand "better support of C-level attributes". Can you give an example?

> Here's another idea, what if extern blocks could contain cpdef
> declarations, which would automatically generate Python-level
> wrappers for the declared members (if possible, otherwise an error)?

Ah, this sounds good! Of the three .pxd roles you list above, external Python modules (e.g. numpy) and Cython-implemented modules (e.g. matched .pxd/.pyx) both already have a presence in Python-space. What's missing is a way to give (where possible) declarations of external C libraries a Python presence. cpdef fills this hole nicely, since its whole purpose is to expose Python interfaces to C-based elements.

A side effect of this cpdef change would be that now even bare .pxd files (no matching .pyx) would have a Python presence, so you could do something like

cimport mylib as mylib_c
import mylib as mylib_py
import sys

# Access through Python
for name in dir(mylib_py):
    setattr(sys.modules[__name__], name, getattr(mylib_py, name))

# Direct C access
cdef get_a():
    return mylib_c.CONST_A

where the Python access would be the new feature, listing all cpdef-ed stuff.

However, from Parsing.py:2369:

error(pos, "C struct/union/enum cannot be declared cpdef")

From pyrex_differences.rst:

If a function is declared :keyword:`cpdef` it can be called from and overridden by both extension and normal python subclasses.
I believe the reason that cpdef-ed enums and similar are currently illegal is confusion between "can be called from Python" and "can be overridden from Python". I think these should be just like methods already are, in that you can "override" a method by subclassing it, but not by rebinding the name in the base class:

>>> import pyximport; pyximport.install()
>>> import rectangle as R
>>> r = R.Rectangle(1, 2, 3, 4)
>>> r.area = lambda(self): r.x1
Traceback (most recent call last):
  File "", line 1, in
AttributeError: 'rectangle.Rectangle' object attribute 'area' is read-only

where rectangle.pyx is a minorly patched version of the last example from early_binding_for_speed.rst [1] and `area` is a cpdef-ed method. Why can't enums share this handling, with the enum taking the place of the method and the enum's module taking the place of the class? After all, enums have a Python-side type (int or long). Unions don't really have a Python parallel, but structs do, so long as you can select which attributes should ha
Re: [Cython] Outdated `hg export` on cython-devel homepage
W. Trevor King, 17.02.2011 14:51:
The `hg export` on the cython-devel page [1] should probably be changed to (or additionally list) `git format-patch`, now that Cython's versioned in Git. Keeping the `hg export` reference might be useful for Mercurial lovers using hg-git. [1]: http://mail.python.org/mailman/listinfo/cython-devel

Fixed, thanks! Stefan
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On Thu, Feb 17, 2011 at 08:29:41AM -0500, W. Trevor King wrote:
> cpdef struct Foo:
>     cpdef public int intA
>     cpdef readonly int intB
>     cdef void *ptr

Oops, for consistency with classes, the variable declarations should read `cdef public` and `cdef readonly`. Perhaps `cdef struct` too, to match `cdef class`? I get a bit confused, because for some things (functions, methods) `cpdef` adds a Python interface. For others (attributes) it's `cdef public/readonly`. There are even some things (classes) where a plain `cdef` is enough to provide a Python interface. Perhaps I am just missing some subtle distinction between the effects of the various incantations?
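As a rough illustration of what a `cpdef struct Foo` wrapper might behave like, here is a hypothetical pure-Python stand-in (the real thing would be a generated C extension type; the point is the `public` vs `readonly` property behavior being proposed, and the class below is our own sketch, not Cython output):

```python
class Foo(object):
    """Hypothetical Python-level wrapper for the C struct in the thread."""

    def __init__(self, intA=0, intB=0):
        self._intA = intA  # public   -> read/write from Python
        self._intB = intB  # readonly -> read-only from Python
        # void *ptr would have no Python-level access at all.

    @property
    def intA(self):
        return self._intA

    @intA.setter
    def intA(self, value):
        self._intA = int(value)  # coerce, as Cython would for an int member

    @property
    def intB(self):  # no setter: the attribute is read-only
        return self._intB


f = Foo(1, 2)
f.intA = 5
assert (f.intA, f.intB) == (5, 2)
```

Assigning to `f.intB` would raise AttributeError, matching the read-only behavior `cdef readonly` gives extension-type attributes today.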
[Cython] [PATCH] Add .gitignores to cython and cython-docs
Here's a pair of patches doing just that. I also ignore *.c in cython, because all .c files are currently auto-generated. Perhaps that will not always be the case? If it seems too risky, feel free to leave that part out.

From a3916ac0ac058e43c9aa75e0a66312618be73edf Mon Sep 17 00:00:00 2001
From: W. Trevor King
Date: Thu, 17 Feb 2011 09:16:21 -0500
Subject: [PATCH] Add *.c to .hgignore and create analogous .gitignore.
---
 .gitignore | 15 +++
 .hgignore  |  1 +
 2 files changed, 16 insertions(+), 0 deletions(-)
 create mode 100644 .gitignore

diff --git a/.gitignore b/.gitignore
new file mode 100644
index 000..2187b82
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,15 @@
+*.pyc
+*.c
+*.swp
+
+Cython/Compiler/Lexicon.pickle
+BUILD/
+build/
+dist/
+.coverage
+*~
+*.orig
+*.rej
+*.dep
+
+tags
diff --git a/.hgignore b/.hgignore
index 6fa7131..e005c72 100644
--- a/.hgignore
+++ b/.hgignore
@@ -1,6 +1,7 @@
 syntax: glob
 *.pyc
+*.c
 *.swp
 Cython/Compiler/Lexicon.pickle
--
1.7.3.4

From bf5bf5d3874cdcf34940b634c56dbeb35faa4137 Mon Sep 17 00:00:00 2001
From: W. Trevor King
Date: Thu, 17 Feb 2011 09:20:30 -0500
Subject: [PATCH 2/2] Create .gitignore analogous to current .hgignore.
---
 .gitignore | 6 ++
 1 files changed, 6 insertions(+), 0 deletions(-)
 create mode 100644 .gitignore

diff --git a/.gitignore b/.gitignore
new file mode 100644
index 000..d431956
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,6 @@
+*.pyc
+*~
+.*.swp
+
+build/
+_build/
--
1.7.3.4
[Cython] Fixing NumPy support for Python 3 (Stefan, please help!)
Stefan, what do you think about the patch below? This hunk is part of a series of fixes required to get numpy-dev working under Python 3.2. The root of the issue is that __cythonbufferdefaults__ keys & values end up being "bytes" (this coercion is triggered in Interpreter.py).

diff --git a/Cython/Compiler/ExprNodes.py b/Cython/Compiler/ExprNodes.py
index 5b339da..b72deef 100755
--- a/Cython/Compiler/ExprNodes.py
+++ b/Cython/Compiler/ExprNodes.py
@@ -12,6 +12,7 @@ cython.declare(error=object, warning=object, warn_once=object,
 Builtin=object, Symtab=object, Utils=object, find_coercion_error
 debug_disposal_code=object, debug_temp_alloc=object, debug_coerc
+import sys
 import operator
 from Errors import error, warning, warn_once, InternalError, CompileError
@@ -1136,6 +1137,8 @@ class StringNode(PyConstNode):
 return self.result_code

 def compile_time_value(self, env):
+if sys.version_info[0] >= 3 and self.unicode_value:
+return self.unicode_value
 return self.value

-- Lisandro Dalcin --- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo 3000 Santa Fe, Argentina Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On Thu, Feb 17, 2011 at 10:53:15AM -0300, Lisandro Dalcin wrote:
> Cython could certainly support "cpdef struct", it is just a matter to
> define a proposal and find a contributor to implement it :-)

Is there a CEP template (a la PEPs 9 and 12) that should be discussed on the mailing list, or do I develop it free-form on the wiki [2]?

p.s. should I be picking one of cython-dev@codespeak or cython-devel@python? Is the shift not yet official?

[2]: http://wiki.cython.org/enhancements/
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On 17 February 2011 11:35, W. Trevor King wrote:
> On Thu, Feb 17, 2011 at 10:53:15AM -0300, Lisandro Dalcin wrote:
>> Cython could certainly support "cpdef struct", it is just a matter to
>> define a proposal and find a contributor to implement it :-)
>
> Is there a CEP template (a la PEPs 9 and 12) that should be discussed
> on the mailing list, or do I develop it free-form on the wiki [2]?

I would just develop it free-form on the wiki.

> p.s. should I be picking one of cython-dev@codespeak or
> cython-devel@python? Is the shift not yet official?

Use cython-devel@python.org

PS: Do we really need a full CEP for this? Does any of you object to "cpdef struct" automatically creating a wrapper extension type for exposing to Python? I think all we need to discuss is how to implement __cinit__(), and what to do with slots you want to use in C but that cannot be mapped to a Python type (like pointers).
Re: [Cython] Fixing NumPy support for Python 3 (Stefan, please help!)
Lisandro Dalcin, 17.02.2011 15:32:
Stefan, what do you think about the patch below? This hunk is part of a series of fixes required to get numpy-dev working under Python 3.2. The root of the issue is that __cythonbufferdefaults__ keys & values end up being "bytes" (this coercion is triggered in Interpreter.py).

diff --git a/Cython/Compiler/ExprNodes.py b/Cython/Compiler/ExprNodes.py
index 5b339da..b72deef 100755
--- a/Cython/Compiler/ExprNodes.py
+++ b/Cython/Compiler/ExprNodes.py
@@ -12,6 +12,7 @@ cython.declare(error=object, warning=object, warn_once=object,
 Builtin=object, Symtab=object, Utils=object, find_coercion_error
 debug_disposal_code=object, debug_temp_alloc=object, debug_coerc
+import sys
 import operator
 from Errors import error, warning, warn_once, InternalError, CompileError
@@ -1136,6 +1137,8 @@ class StringNode(PyConstNode):
 return self.result_code

 def compile_time_value(self, env):
+if sys.version_info[0] >= 3 and self.unicode_value:

You must use "self.unicode_value is not None" here; it may be the empty string.

+return self.unicode_value
 return self.value

Ok, that's a tricky one. Just because the compilation is running in Py3 doesn't mean that the correct compile time value is a Unicode string -- we don't know what it'll be used for. Doing the above will do the wrong thing, e.g. in this case:

DEF const_x = "abc"
cdef str x = const_x

The problem is: it is broken already; returning self.value is wrong because it drops available type information by returning plain bytes instead of str. And simply returning self.unicode_value in Py3 doesn't fix that.

Stefan
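Stefan's point about the empty string can be demonstrated in plain Python. The names below are illustrative stand-ins for the `StringNode.value` / `StringNode.unicode_value` attributes, not the real compiler code:

```python
# Stand-ins for a StringNode built from an empty string literal:
value = b''          # StringNode.value (bytes)
unicode_value = u''  # StringNode.unicode_value (str, but falsy!)

def compile_time_value_buggy():
    if unicode_value:              # empty str is falsy: wrong branch taken
        return unicode_value
    return value

def compile_time_value_fixed():
    if unicode_value is not None:  # explicit None check, as Stefan suggests
        return unicode_value
    return value

assert compile_time_value_buggy() == b''   # silently falls back to bytes
assert compile_time_value_fixed() == u''   # returns the unicode value
```

The truthiness test and the `is not None` test agree for every non-empty string, which is why the bug only bites on the edge case.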
[Cython] python 2.7/3.x and numpy-dev (Dag, I need a quick comment)
I'm working on a patch to get old, recent, and dev NumPy working in 2.7/3.x. So far I've had success, but I still have two failures like the one pasted below. Dag, could you elaborate a bit on the purpose of __Pyx_BufFmt_CheckString()? Is it just a validity check for PEP 3118 format strings? Do you expect the failure below to be hard to fix? Just in case, the format string that triggers the failure is:

>>> memoryview(np.zeros((1,), dtype=np.dtype('b,i', align=False))).format
'T{b:f0:=i:f1:}'

==
FAIL: numpy_test ()
Doctest: numpy_test
--
Traceback (most recent call last):
  File "/usr/local/python/3.2/lib/python3.2/doctest.py", line 2113, in runTest
    raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for numpy_test
  File "/u/dalcinl/Devel/Cython/cython/BUILD/run/c/numpy_test.cpython-32dm.so", line 1, in numpy_test
--
File "/u/dalcinl/Devel/Cython/cython/BUILD/run/c/numpy_test.cpython-32dm.so", line 155, in numpy_test
Failed example:
    print(test_packed_align(np.zeros((1,), dtype=np.dtype('b,i', align=False
Exception raised:
    Traceback (most recent call last):
      File "/usr/local/python/3.2/lib/python3.2/doctest.py", line 1248, in __run
        compileflags, 1), test.globs)
      File "", line 1, in
        print(test_packed_align(np.zeros((1,), dtype=np.dtype('b,i', align=False
      File "numpy_test.pyx", line 404, in numpy_test.test_packed_align (numpy_test.c:6367)
    ValueError: Buffer packing mode currently only allowed at beginning of format string (this is a defect)

-- Lisandro Dalcin --- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo 3000 Santa Fe, Argentina Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169
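For context, the `=` inside that format string switches to packed (padding-free) layout partway through the struct, which is what the check rejects. The packed-vs-aligned distinction itself can be seen from pure Python with the stdlib `struct` module, whose format characters PEP 3118 extends (the 8-byte figure assumes a typical platform with 4-byte, 4-aligned ints):

```python
import struct

# '=' selects standard sizes with no alignment padding (packed);
# '@' selects native sizes with native alignment (padding inserted).
packed = struct.calcsize('=bi')   # 1 + 4 bytes, no padding
aligned = struct.calcsize('@bi')  # 1 + 3 padding + 4 bytes on common ABIs

assert packed == 5
assert aligned >= packed          # alignment can only add padding
```

A 'b,i' NumPy dtype with align=False corresponds to the packed layout, hence the '=' marker embedded in the exported buffer format.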
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King wrote: > This thread is coming over to cython-dev (and the new cython-devel) > from cython-users because it turns out it will probably require > chaning the Cython code. To get everyone who hasn't been following on > cython-users up to speed, here's a summary of what I'm trying to do: > > That's what I was trying to give with this: > > On Wed, Feb 09, 2011 at 12:23:25PM -0500, W. Trevor King wrote: >> I'm wrapping an external C library with Cython, so I have `mylib.pxd`: >> >> cdef extern from 'mylib.h' >> enum: CONST_A >> enum: CONST_B >> ... >> >> where I declare each constant macro from the library's header `mylib.h`: >> >> #define CONST_A 1 >> #define CONST_B 2 >> ... >> >> Now I want to expose those constants in Python, so I have `expose.pyx`: >> >> cimport mylib >> >> CONST_A = mylib.CONST_A >> CONST_B = mylib.CONST_B >> ... >> >> But the last part seems pretty silly. I'd like to do something like >> >> cimport mylib >> import sys >> >> for name in dir(mylib): >> setattr(sys.modules[__name__], name, getattr(mylib, name)) >> >> which compiles fine, but fails to import with... > > Looking into the Cython internals, everything defined in mylib.pxd is > stored as `Entry`s in a `ModuleScope`, and... > > On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote: >> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King wrote: >> > What I'm missing is a way to bind the ModuleScope namespace to a name >> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib, >> > name)` will work in expose.pyx. >> >> You have also hit into the thorny issue that .pxd files are used for >> many things. They may be pure C library declarations with no Python >> module backing, they may be declarations of (externally implemented) >> Python modules (such as numpy.pxd), or they may be declarations for >> Cython-implemented modules. > >> > It seems like it would be easier to generate some kind of wrapper >> > class (PxdModule?) 
for mylib when it is cimported (at compile time), >> > and then further interactions would take care of themselves (at run >> > time). >> >> Would such an object be created anew for every module that cimports >> the declaration file? > > Hmm, That doesn't sound very nice, does it. However, .pxd files > declaring C libraries have no Python-space presence, so that was my > initial idea. > >> I have toyed with the idea of subclassing the module object itself for >> better support of C-level attributes from the Python (and Cython) >> namespaces. > > Sorry, I don't understand "better support of C-level attributes". Can > you give an example? The extern cpdef declarations are an example of this. >> Here's another idea, what if extern blocks could contain cpdef >> declarations, which would automatically generate a Python-level >> wrappers for the declared members (if possible, otherwise an error)? > > Ah, this sounds good! Of the three .pxd roles you list above, > external Python modules (e.g. numpy) and Cython-implemented modules > (e.g. matched .pxd/.pyx) both already have a presence in Python-space. > What's missing is a way to give (where possible) declarations of > external C libraries a Python presence. cpdef fills this hole nicely, > since its whole purpose is to expose Python interfaces to > C-based elements. In the case of external Python modules, I'm not so sure we want to monkey-patch our stuff in (and where would we do it--on the first import of a cimporting module?) > A side effect of this cpdef change would be that now even bare .pxd > files (no matching .pyx) would have a Python presence, Where would it live? Would we just create this module (in essence, acting as if there was an empty .pyx file sitting there as well)? On this note, it may be worth pursuing the idea of a "cython helper" module where common code and objects could live. 
> so You could do > something like > > cimport mylib as mylib_c > import mylib as mylib_py > import sys > > # Access through Python > for name in dir(mylib_py): > setattr(sys.modules[__name__], name, getattr(mylib_py, name)) I think this smells worse than "import *" > # Direct C access > cdef get_a(): > return mylib_c.CONST_A > > Where the Python access would be the new feature, and list all > cpdef-ed stuff. > > However, from Parsing.py:2369: > > error(pos, "C struct/union/enum cannot be declared cpdef") > > From pyrex_differences.rst: > > If a function is declared :keyword:`cpdef` it can be called from > and overridden by both extension and normal python subclasses. > > I believe the reason that cpdef-ed enums and similar are currently > illegal is confusion between "can be called from Python" and "can be > overridden from Python". The reason that error statement is there is because it had no meaning, so an error was better than just ignoring it. > I think these should be just like methods > already are, in th
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On Thu, Feb 17, 2011 at 01:25:10PM -0800, Robert Bradshaw wrote: > On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King wrote: > > On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote: > >> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King wrote: > >> > What I'm missing is a way to bind the ModuleScope namespace to a name > >> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib, > >> > name)` will work in expose.pyx. > >> > >> You have also hit into the thorny issue that .pxd files are used for > >> many things. They may be pure C library declarations with no Python > >> module backing, they may be declarations of (externally implemented) > >> Python modules (such as numpy.pxd), or they may be declarations for > >> Cython-implemented modules. > >> > >> Here's another idea, what if extern blocks could contain cpdef > >> declarations, which would automatically generate a Python-level > >> wrappers for the declared members (if possible, otherwise an error)? > > > > Ah, this sounds good! Of the three .pxd roles you list above, > > external Python modules (e.g. numpy) and Cython-implemented modules > > (e.g. matched .pxd/.pyx) both already have a presence in Python-space. > > What's missing is a way to give (where possible) declarations of > > external C libraries a Python presence. cpdef fills this hole nicely, > > since its whole purpose is to expose Python interfaces to > > C-based elements. > > In the case of external Python modules, I'm not so sure we want to > monkey-patch our stuff in I don't think any of the changes we are suggesting would require changes to existing code, so .pxd-s with external implementations wouldn't be affected unless they brought the changes upon themselves. > (and where would we do it--on the first import of a cimporting > module?) Compilation is an issue. I think that .pxd files should be able to be cythoned directly, since then Cython can build any wrappers they request.
If the file has a matching .pyx file, cythoning either one should compile both together, since they'll produce a single Python .so module. > > A side effect of this cpdef change would be that now even bare .pxd > > files (no matching .pyx) would have a Python presence, > > Where would it live? Would we just create this module (in essence, > acting as if there was an empty .pyx file sitting there as well)? On > this note, it may be worth pursuing the idea of a "cython helper" > module where common code and objects could live. I'm not sure exactly what you mean by "cython helper", but this sounds like my 'bare .pxd can create a Python .so module' idea above. > > so You could do > > something like > > > > cimport mylib as mylib_c > > import mylib as mylib_py > > import sys > > > > # Access through Python > > for name in dir(mylib_py): > > setattr(sys.modules[__name__], name, getattr(mylib_py, name)) > > I think this smells worse than "import *" Aha, thanks ;). I was stuck in my old .pxd-files-don't-create-modules-by-themselves mindset. Obviously, once they do, any Python code can access the contents directly and I can throw out all this indirection. > > However, from Parsing.py:2369: > > > >error(pos, "C struct/union/enum cannot be declared cpdef") > > > > From pyrex_differences.rst: > > > >If a function is declared :keyword:`cpdef` it can be called from > >and overridden by both extension and normal python subclasses. > > > > I believe the reason that cpdef-ed enums and similar are currently > > illegal is confusion between "can be called from Python" and "can be > > overridden from Python". > > The reason that error statement is there is because it had no meaning, > so an error was better than just ignoring it. Why does it have no meaning? I understand that it's not implemented yet, but a cpdef-ed enum or struct seems just as valid an idea as a cpdef-ed method. > > Unions don't really have a Python parallel, > > They can be a cdef class wrapping the union type.
But I would think coercion would be difficult. Unions are usually (in my limited experience) for "don't worry about the type, just make sure it fits in X bytes". How would union->Python conversion work? > >cpdef struct Foo: > >cpdef public int intA > >cpdef readonly int intB > >cdef void *ptr > > > > We would both declare the important members of the C struct (as we can > > already do in Cython) and also have Cython automatically generate a > > Python class wrapping the struct (because of `cpdef struct`). The > > Python class would have: > > > > * Cython-generated getter/setter for intA (because of `cpdef public`) > > using the standard Python<->int coercion. > > * Similar Cython-generated getter for int B (because of `cpdef > > readonly`). > > * No Python access to ptr (standard C-access still possible through > > Cython). > > > > Doing something crazy like `cdef public void *ptr` would raise a > > compile-time error. > > Yes, all of the abov
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
Forgot reply-all... didn't we have this discussion before about making that the default for this list, as it is by far the most common desired behavior? On Thu, Feb 17, 2011 at 3:53 PM, Robert Bradshaw wrote: > On Thu, Feb 17, 2011 at 3:12 PM, W. Trevor King wrote: >> On Thu, Feb 17, 2011 at 01:25:10PM -0800, Robert Bradshaw wrote: >>> On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King wrote: >>> > On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote: >>> >> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King wrote: >>> >> > What I'm missing is a way to bind the ModuleScope namespace to a name >>> >> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib, >>> >> > name)` will work in expose.pyx. >>> >> >>> >> You have also hit into the thorny issue that .pxd files are used for >>> >> many things. They may be pure C library declarations with no Python >>> >> module backing, they may be declarations of (externally implemented) >>> >> Python modules (such as numpy.pxd), or they may be declarations for >>> >> Cython-implemented modules. >>> >> >>> >> Here's another idea, what if extern blocks could contain cpdef >>> >> declarations, which would automatically generate a Python-level >>> >> wrappers for the declared members (if possible, otherwise an error)? >>> > >>> > Ah, this sounds good! Of the three .pxd roles you list above, >>> > external Python modules (e.g. numpy) and Cython-implemented modules >>> > (e.g. matched .pxd/.pyx) both already have a presence in Python-space. >>> > What's missing is a way to give (where possible) declarations of >>> > external C libraries a Python presence. cpdef fills this hole nicely, >>> > since its whole purpose is to expose Python interfaces to >>> > C-based elements.
>>> >>> In the case of external Python modules, I'm not so sure we want to >>> monkey-patch our stuff in >> >> I don't think any of the changes we are suggesting would require >> changes to existing code, so .pxd-s with external implementations >> wouldn't be affected unless they brough the changes upon themselves. > > Say, in numpy.pxd, I have > > cdef extern from "...": > cpdef struct obscure_internal_struct: > ... > > Do we add an "obscure_internal_struct" onto the (global) numpy module? > What if it conflicts with a (runtime) name? This is the issue I'm > bringing up. > >>> (and where would we do it--on the first import of a cimporting >>> module?) >> >> Compilation is an issue. I think that .pxd files should be able to be >> cythoned directly, since then they Cython can build any wrappers they >> request. If the file has a matching .pyx file, cythoning either one >> should compile both together, since they'll produce a single Python >> .so module. > > In this case, I think it may make more sense to consider cimporting > stuff from .pyx files if no .pxd file is present. In any case, this is > a big change. I don't like the idea of always creating such a module > (it may be empty, or have name conflicts) nor the idea of > conditionally compiling it (does one have to look at the contents and > really understand Cython to see if a Python shadow will be created?) > >>> > A side effect of this cpdef change would be that now even bare .pxd >>> > files (no matching .pyx) would have a Python presence, >>> >>> Where would it live? Would we just create this module (in essence, >>> acting as if there was an empty .pyx file sitting there as well)? On >>> this note, it may be worth pursuing the idea of a "cython helper" >>> module where common code and objects could live. >> >> I'm not sure exactly what you mean by "cython helper", but this sounds >> like my 'bare .pyx can create a Python .so module idea above. > > I'm thinking of a place to put, e.g. 
the generator and bind-able > function classes, which are now re-implemented in every module that > uses them. I think there will be more cases like this in the future > rather than less. C-level code could be #included and linked from > "global" stores as well. However, that's somewhat tangential. > >>> > so you could do >>> > something like >>> > >>> > cimport mylib as mylib_c >>> > import mylib as mylib_py >>> > import sys >>> > >>> > # Access through Python >>> > for name in dir(mylib_py): >>> > setattr(sys.modules[__name__], name, getattr(mylib_py, name)) >>> >>> I think this smells worse than "import *" >> >> Aha, thanks ;). I was stuck in my old >> .pxd-files-don't-create-modules-by-themselves mindset. Obviously, >> once they do, any Python code can access the contents directly and I >> can throw out all this indirection. >> >>> > However, from Parsing.py:2369: >>> > >>> > error(pos, "C struct/union/enum cannot be declared cpdef") >>> > >>> > From pyrex_differences.rst: >>> > >>> > If a function is declared :keyword:`cpdef` it can be called from >>> > and overridden by both extension and normal python subclasses. >>> > >>> > I believe the reason that cpdef-ed enums and similar are currently i
[Cython] Python C-api ref count semantics
What is the rule of thumb when declaring functions from Python's C-API when it comes to ref counting? If I define a test case like so:

    cdef extern from "Python.h":
        object PyWeakref_NewRef(object, object)
        object PyWeakref_GET_OBJECT(object)

    class Foo(object):
        pass

    cdef class Test:
        cdef object obj
        cdef object wr

        def __init__(self):
            self.obj = Foo()
            self.wr = PyWeakref_NewRef(self.obj, None)

        def get_ref(self):
            return PyWeakref_GET_OBJECT(self.wr)

I get these random Python fatal errors:

    In [8]: %timeit -n 1000 t.get_ref()
    1000 loops, best of 3: 224 ns per loop
    In [9]: %timeit -n 1000 t.get_ref()
    Fatal Python error: deallocating None
    Abort trap

However, if I redefine the macro signature and getter function to this:

    from cpython cimport PyObject

    cdef extern from "Python.h":
        object PyWeakref_NewRef(object, object)
        PyObject* PyWeakref_GET_OBJECT(object)

    class Foo(object):
        pass

    cdef class Test:
        cdef object obj
        cdef object wr

        def __init__(self):
            self.obj = Foo()
            self.wr = PyWeakref_NewRef(self.obj, None)

        def clear_obj(self):
            self.obj = None

        def get_ref(self):
            return PyWeakref_GET_OBJECT(self.wr)

Then it runs without issue. All I can gather is that it has to do with the incref/decref going on in the generated C code. Should I be doing something on my end to manually manage ref counts when using the C-API?
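For comparison, the weak-reference semantics that the two C-API calls wrap can be sketched at the Python level with the stdlib `weakref` module (a plain-Python sketch of the behavior, not the Cython test case above):

```python
import weakref

class Foo(object):
    pass

obj = Foo()
wr = weakref.ref(obj)   # roughly what PyWeakref_NewRef(obj, None) creates

assert wr() is obj      # referent alive: the weakref resolves to it
del obj                 # CPython refcounting frees the object immediately
assert wr() is None     # referent gone: the weakref now resolves to None
```

Calling a weakref object at the Python level returns a new reference, which is why this pure-Python version never crashes; the C-level `PyWeakref_GET_OBJECT` macro instead returns a borrowed one.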
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
On Thu, Feb 17, 2011 at 3:53 PM, Robert Bradshaw wrote: > On Thu, Feb 17, 2011 at 3:12 PM, W. Trevor King wrote: >> On Thu, Feb 17, 2011 at 01:25:10PM -0800, Robert Bradshaw wrote: >>> On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King wrote: >>> > On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote: >>> >> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King wrote: >>> >> > What I'm missing is a way to bind the ModuleScope namespace to a name >>> >> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib, >>> >> > name)` will work in expose.pyx. >>> >> >>> >> You have also hit into the thorny issue that .pxd files are used for >>> >> many things. They may be pure C library declarations with no Python >>> >> module backing, they may be declarations of (externally implemented) >>> >> Python modules (such as numpy.pxd), or they may be declarations for >>> >> Cython-implemented modules. >>> >> >>> >> Here's another idea: what if extern blocks could contain cpdef >>> >> declarations, which would automatically generate Python-level >>> >> wrappers for the declared members (if possible, otherwise an error)? >>> > >>> > Ah, this sounds good! Of the three .pxd roles you list above, >>> > external Python modules (e.g. numpy) and Cython-implemented modules >>> > (e.g. matched .pxd/.pyx) both already have a presence in Python-space. >>> > What's missing is a way to give (where possible) declarations of >>> > external C libraries a Python presence. cpdef fills this hole nicely, >>> > since its whole purpose is to expose Python interfaces to >>> > C-based elements. >>> >>> In the case of external Python modules, I'm not so sure we want to >>> monkey-patch our stuff in >> >> I don't think any of the changes we are suggesting would require >> changes to existing code, so .pxd-s with external implementations >> wouldn't be affected unless they brought the changes upon themselves. 
> > Say, in numpy.pxd, I have > > cdef extern from "...": > cpdef struct obscure_internal_struct: > ... > > Do we add an "obscure_internal_struct" onto the (global) numpy module? > What if it conflicts with a (runtime) name? This is the issue I'm > bringing up. Defining a cpdef *and* a non-matching external implementation should raise a compile-time error. I agree that there is a useful distinction between external-C-library and external-Python-module .pxd wrappers. Perhaps your matching blank .py or .pyx file could serve as a marker that the .pxd file should be inflated into its own full-fledged Python module. I'm not even sure how you would go about adding attributes to the numpy module. When/how would the Cython-created attributes get added? In the external-C-library case, if you define something incompatible in the matching .pyx or .py file, Cython will be able to see it and die with an appropriate error, so you can resolve your programming mistake. If you try to override anything in a .so compiled module at runtime, you'd get the same kind of error you currently do trying to rebind a compiled class' method. >>> (and where would we do it--on the first import of a cimporting >>> module?) >> >> Compilation is an issue. I think that .pxd files should be able to be >> cythoned directly, since then Cython can build any wrappers they >> request. If the file has a matching .pyx file, cythoning either one >> should compile both together, since they'll produce a single Python >> .so module. > > In this case, I think it may make more sense to consider cimporting > stuff from .pyx files if no .pxd file is present. Can you cimport .pyx files that lack matching .pxd files? > In any case, this is a big change. I don't like the idea of always > creating such a module (it may be empty, or have name conflicts) It shouldn't take too long to compile an empty module ;). And odds are no one will waste time importing it either. 
> nor the idea of conditionally compiling it (does one have to look at > the contents and really understand Cython to see if a Python shadow > will be created?) I agree here. Under the mantra "explicit is better than implicit", we could have users add something like cdef module "modname" to any .pxd files that should be inflated into Python modules. .pxd files without such a tag would receive the current treatment, error on any cpdef, etc. The drawback of this approach is that it makes Cython more complicated, but if both behaviors are reasonable, there's probably no getting around that. >>> > A side effect of this cpdef change would be that now even bare .pxd >>> > files (no matching .pyx) would have a Python presence, >>> >>> Where would it live? Would we just create this module (in essence, >>> acting as if there was an empty .pyx file sitting there as well)? On >>> this note, it may be worth pursuing the idea of a "cython helper" >>> module where common code and objects could live. >> >> I'm not sure exactly what you mean by "cython helper", but this sounds >> like my 'bare .py
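To make the proposal under discussion concrete, the syntax would look roughly like this. This is a hypothetical sketch of the thread's idea, not valid Cython at the time of writing — Parsing.py:2369, quoted earlier, still rejects cpdef on struct/union/enum:

```
# mylib.pxd -- hypothetical cpdef-in-extern syntax from this thread
cdef extern from "mylib.h":
    # cpdef here would generate a Python-level shadow for the enum...
    cpdef enum Color:
        RED
        GREEN

    # ...and a callable Python wrapper for the C function.
    cpdef int add(int a, int b)
```

Under the "explicit is better than implicit" variant proposed above, such a .pxd would additionally carry a marker (e.g. the suggested `cdef module "modname"`) before Cython inflated it into a standalone Python module.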
[Cython] Gmane archive
Hi, I still didn't get a response from Gmane, but this article doesn't look promising: http://article.gmane.org/gmane.discuss/13987 So I guess we'll have to request a new group. Very unfortunate. Stefan
Re: [Cython] Python C-api ref count semantics
Chris Colbert, 18.02.2011 03:23: What is the rule of thumb when declaring functions from Python's C-API when it comes to ref counting? The general rule is to not declare them yourself. Instead, cimport them from the cpython package. (See Cython/Includes/) If I define a test case like so: cdef extern from "Python.h": object PyWeakref_NewRef(object, object) object PyWeakref_GET_OBJECT(object) class Foo(object): pass cdef class Test: cdef object obj cdef object wr def __init__(self): self.obj = Foo() self.wr = PyWeakref_NewRef(self.obj, None) def get_ref(self): return PyWeakref_GET_OBJECT(self.wr) I get these random Python fatal errors: In [8]: %timeit -n 1000 t.get_ref() 1000 loops, best of 3: 224 ns per loop In [9]: %timeit -n 1000 t.get_ref() Fatal Python error: deallocating None Abort trap However, if I redefine the macro signature and getter function to this: from cpython cimport PyObject cdef extern from "Python.h": object PyWeakref_NewRef(object, object) PyObject* PyWeakref_GET_OBJECT(object) class Foo(object): pass cdef class Test: cdef object obj cdef object wr def __init__(self): self.obj = Foo() self.wr = PyWeakref_NewRef(self.obj, None) def clear_obj(self): self.obj = None def get_ref(self): return PyWeakref_GET_OBJECT(self.wr) Then it runs without issue. All I can gather is that it has to do with the incref/decref going on in the generated C code. Should I be doing something on my end to manually manage ref counts when using the C-API? Check the CPython documentation. Whenever a function returns a borrowed reference, you must declare it as PyObject* and cast it to <object>. That being said, support for borrowed references has long been on the list, but no one has shown interest in doing it (or getting it done) so far. Stefan
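The borrowed-vs-new distinction behind Stefan's advice can be observed from pure Python with `sys.getrefcount` (a sketch of the reference semantics, not of the generated C code):

```python
import sys
import weakref

class Foo(object):
    pass

obj = Foo()
wr = weakref.ref(obj)

# Calling the weakref at the Python level returns a NEW reference, so
# the refcount visibly goes up while `ref` holds it. The C-level
# PyWeakref_GET_OBJECT macro instead returns a BORROWED reference;
# decref'ing a borrowed reference (which Cython's generated code does
# for anything typed `object`) is what produced the fatal error above.
before = sys.getrefcount(obj)
ref = wr()                                # new reference, held by `ref`
assert ref is obj
assert sys.getrefcount(obj) == before + 1  # one extra reference
del ref
assert sys.getrefcount(obj) == before      # back to the baseline
```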
Re: [Cython] [cython-users] Cython .pxd introspection: listing defined constants
Robert Bradshaw, 18.02.2011 00:54: Forgot reply-all... didn't we have this discussion before about making that the default for this list, as it is by far the most common desired behavior? Yes we did. And I guess it would be "the default" for mailing lists if it was just that: a default, not something that breaks replying. However, given that this has been discussed and decided, I'll just go and break replying for this list again. Stefan
Re: [Cython] Fixing NumPy support for Python 3 (Stefan, please help!)
Lisandro Dalcin, 17.02.2011 17:24: On 17 February 2011 12:16, Stefan Behnel wrote: Lisandro Dalcin, 17.02.2011 15:32: Stefan, what do you think about the patch below? This hunk is part of a series of fixes required to get numpy-dev working under Python 3.2. The root of the issue is that __cythonbufferdefaults__ keys & values end up being "bytes" (this coercion is triggered in Interpreter.py).

    diff --git a/Cython/Compiler/ExprNodes.py b/Cython/Compiler/ExprNodes.py
    index 5b339da..b72deef 100755
    --- a/Cython/Compiler/ExprNodes.py
    +++ b/Cython/Compiler/ExprNodes.py
    @@ -12,6 +12,7 @@ cython.declare(error=object, warning=object, warn_once=object, Builtin=object, Symtab=object, Utils=object, find_coercion_error debug_disposal_code=object, debug_temp_alloc=object, debug_coerc
    +import sys
     import operator
     from Errors import error, warning, warn_once, InternalError, CompileError
    @@ -1136,6 +1137,8 @@ class StringNode(PyConstNode):
             return self.result_code

         def compile_time_value(self, env):
    +        if sys.version_info[0] >= 3 and self.unicode_value:

You must use "self.unicode_value is not None" here, it may be the empty string.

    +            return self.unicode_value
             return self.value

Ok, that's a tricky one. Just because the compilation is running in Py3 doesn't mean that the correct compile time value is a Unicode string - we don't know what it'll be used for. OK, I've found an alternative workaround. What do you think?

    diff --git a/Cython/Compiler/Interpreter.py b/Cython/Compiler/Interpreter.py
    index 83cb184..9fb5fe5 100644
    --- a/Cython/Compiler/Interpreter.py
    +++ b/Cython/Compiler/Interpreter.py
    @@ -6,6 +6,7 @@ For now this only covers parse tree to value conversion of compile-time values.
     """
    +import sys
     from Nodes import *
     from ExprNodes import *
     from Errors import CompileError
    @@ -44,6 +45,10 @@ def interpret_compiletime_options(optlist, optdict, type_env=None, type_args=())
         else:
             raise CompileError(node.pos, "Type not allowed here.")
     else:
    +    if (sys.version_info[0] >= 3 and
    +        isinstance(node, StringNode) and
    +        node.unicode_value is not None):
    +        return (node.unicode_value, node.pos)
         return (node.compile_time_value(empty_scope), node.pos)

     if optlist:
    @@ -52,6 +57,7 @@ def interpret_compiletime_options(optlist, optdict, type_env=None, type_args=())
     assert isinstance(optdict, DictNode)
     new_optdict = {}
     for item in optdict.key_value_pairs:
    -    new_optdict[item.key.value] = interpret(item.value, item.key.value)
    +    new_key, dummy = interpret(item.key, None)
    +    new_optdict[new_key] = interpret(item.value, item.key.value)
     optdict = new_optdict
     return (optlist, new_optdict)

This still isn't something that looks right. It just does the same thing at a different place. Actually, I'm not sure there is a way to "get it right". Doing the above will do the wrong thing e.g. in this case:

    DEF const_x = "abc"
    cdef str x = const_x

The problem is: it is broken already, returning self.value is wrong because it drops available type information by returning plain bytes instead of str. And simply returning self.unicode_value in Py3 doesn't fix that. I see... So the correct compile time value for StringNode should depend on options.language_level, right? Hmmm. I'm not sure that would solve it. Py2 str has the property of changing type depending on the runtime environment. So this is actually independent of the language_level (-3 has easy semantics here). I mean, it won't even solve the problem at hand, because the code could still be Py2 but require a unicode string value because it gets *compiled* under Python 3. It shouldn't depend on the compile time environment at all, but the NumPy problem shows that it has to sometimes. 
I think we have to find a way to keep the double bytes/unicode string identity alive during runtime processing, up to the point where we can (or have to) decide what to make of it. Stefan
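Stefan's earlier review note on the first patch ("it may be the empty string") is easy to verify in plain Python: an empty unicode value is falsy, so a truthiness test on `unicode_value` would wrongly treat it as absent, which is why the check must be `is not None`:

```python
# Hypothetical stand-in for StringNode.unicode_value on an empty literal.
unicode_value = u""

assert not unicode_value             # falsy: `if self.unicode_value` skips it
assert unicode_value is not None     # but the value exists, hence `is not None`

# By contrast, a genuinely absent value:
unicode_value = None
assert unicode_value is None
```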
[Cython] del command
Hello, python 2.6.6 Cython-0.14.1 xp, mac, linux When I am using the "del" command in my scripts and compiling using pure Python mode, I am getting an error: "Deletion of local or C global name not supported" There are a couple of places where I have to delete an object. I am using a hack to solve this, but I am not sure it's the correct way. In a simple python script (delobj.py), I am writing a function: def deleteObject(obj): del obj I am importing this script in all the scripts I am compiling using Cython, and there I am using: delobj.deleteObject(obj) This is working fine with no errors. Is this the correct way? Regards Prashant
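One caveat about the workaround, easily checked in plain Python: `del obj` inside `deleteObject` only removes the function-local name, so the caller's reference (and therefore the object) survives the call; it does not force deallocation the way `del` on the caller's own name would:

```python
# delobj.py-style helper, as described in the post above
def deleteObject(obj):
    del obj  # unbinds only this function's local name

x = [1, 2, 3]
deleteObject(x)
assert x == [1, 2, 3]  # caller's binding is untouched; nothing was freed
```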
Re: [Cython] del command
Reply sent to cython-users. Stefan