On Tue, Apr 9, 2013 at 10:24 PM, Stefan Behnel <stefan...@behnel.de> wrote:
> Nikita Nemkin, 10.04.2013 06:22:
>> Robert Bradshaw, 09.04.2013 20:16:
>>>>> Yep. We could even create stub .so files for the included ones that
>>>>> did nothing but import the big one to avoid having to worry about
>>>>> importing things in the right order.
>>
>> The stubs don't have to be shared libraries; pure Python would work
>> just fine (and reduce the complexity).
>
> Sure. That's actually the most common way to do it. Most C modules in
> CPython's stdlib have a Python wrapper script, for example. It's also not
> uncommon to add a certain amount of functionality in that script. That
> makes even just "having it there" a feature for future development.
>
>
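
For illustration, such a stub could be a plain .py file whose only job is
to re-export the compiled implementation from the bundle (all names below
are made up, and exactly how a bundle would expose its member modules is
of course part of what we'd have to design):

    # mypkg/fast.py -- pure-Python stub for a module that was compiled
    # into the hypothetical bundled extension mypkg._bundle
    from mypkg._bundle import *   # re-export the compiled implementation

And as you say, once that .py file exists it's also a natural place to
grow pure-Python conveniences later.
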
>>>>> The big question is what API to specify that things should be linked
>>>>> together. An option to cythonize(...) could be used, but for some
>>>>> projects (e.g. Sage) one would want more control over which modules
>>>>> get bundled together rather than all-or-nothing.
>>>>
>>>> No-one forces you to use a plain cythonize("**/*.pyx"). Just be more
>>>> specific about what files to include in each pattern that you pass in.
>>>> And you can always call cythonize() more than once if you need to. Once for
>>>> each meta-module should usually be acceptable, given that the bundled
>>>> source modules would tend to have a common configuration anyway.
>>>
>>> That would still be painful. Ideally, users should never have to
>>> modify the setup.py file.
>>
>> Users have to plan for what to bundle, what the package structure
>> will be, etc. It is not an "enable and forget" type of thing.
>> (Unless you have an all-Cython package tree.)
>
> Right. And even in that case, you'd want to make sure that importing one
> little module doesn't instantiate loads of useless other modules. That
> would just kill your startup time and waste memory. So this is really about
> careful layout and planning.
>
> Maybe compiling whole packages isn't such a good idea after all, unless
> at least most of the modules in that package are actually required for the
> core functionality.
>
>
>> I prefer explicitly creating "bundle" extensions with
>> distutils.core.Extension, passing multiple .pyx files as sources,
>> then passing the result to cythonize.
>
> And I don't see why we should not require users to do it exactly like this.
> As you said, it's an explicit thing anyway, so making it explicit in the
> setup.py script seems like a good way to express it to me.
>
> Sage as a huge project might be an exception, but I'm sure there are not
> many projects of similar size out there. And even in Sage, ISTM that a
> single point of explicit packaging configuration would be better than
> implicit meta-package building inside of cythonize. In the worst case, it
> can always be externalised into some kind of .ini or JSON file and loaded
> from there in a generic way.
>
>
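
For concreteness, the explicit style Nikita describes would presumably end
up looking something like this in setup.py (module names are illustrative,
and the multiple-.pyx-sources-per-Extension part is exactly the feature
under discussion here, not something that works today):

    from distutils.core import setup, Extension
    from Cython.Build import cythonize

    # one Extension per bundle, listing every .pyx it should absorb
    speedups = Extension(
        "mymodule._speedups",
        sources=["mymodule/a.pyx", "mymodule/b.pyx"],
    )

    setup(ext_modules=cythonize([speedups]))
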
>> If you really want to push this into cythonize (it already does more
>> than it should IMO), one option is to add a new comment directive,
>> for example:
>> # cython: bundle = mymodule._speedups
>> similar to how distutils options are passed.
>> All .pyx files with the same "bundle" value will be put in that bundle.

Yeah, something like this might make sense. Though, since it's less
natural from a build perspective, it might be nicer to specify the
dependency the other way around.
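
Purely to contrast the two directions (neither the directive nor the
keyword below exists in cythonize() today; the names are invented):

    from Cython.Build import cythonize

    # (a) Nikita's form: each .pyx carries a header comment such as
    #     "# cython: bundle = mymodule._speedups" and setup.py just globs:
    ext_modules = cythonize("mymodule/*.pyx")

    # (b) the reverse: the build script names the bundle and enumerates its
    #     members, via a hypothetical keyword argument:
    ext_modules = cythonize(
        "mymodule/*.pyx",
        bundle={"mymodule._speedups": ["mymodule/a.pyx", "mymodule/b.pyx"]})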

> I don't consider it a source level thing. It's a build time packaging
> thing, so it should be a build time decision, which makes setup.py the
> right place to express it.

With Sage it's a major pain to keep the explicit extension module list
in sync with the source tree, especially the (transitive) requirements
like C++ness and referenced libraries.
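
For what it's worth, the existing "# distutils:" header comments already
keep the C++ and library requirements next to the source, e.g. at the top
of a .pyx file (the library name is illustrative):

    # distutils: language = c++
    # distutils: libraries = gmp

so in principle the build script can stay as generic as

    from distutils.core import setup
    from Cython.Build import cythonize

    setup(ext_modules=cythonize("sage/**/*.pyx"))

(paths illustrative) even while the source tree changes underneath it.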

- Robert
