Thanks Simon, a few replies...

On Sat, Nov 12, 2011 at 6:14 AM, Simon Urbanek
<simon.urba...@r-project.org> wrote:

> Tyler,
>
> On Nov 11, 2011, at 7:55 PM, Tyler Pirtle wrote:
>
> > Hi,
> >
> > I've got a C extension structured roughly like:
> >
> > package/
> >  src/
> >    Makevars
> >    foo.c
> >    some-lib/...
> >    some-other-lib/..
> >
> > where foo.c and Makevars define dependencies on some-lib and
> > some-other-lib. Currently I'm having Makevars configure and make install
> > some-lib and some-other-lib into a local build directory, which produces
> > shared libraries that I ultimately reference for foo.o in PKG_LIBS.
> >
> > I'm concerned about distribution. I've set up the appropriate magic with
> > rpath for the package's .so
>
> That is certainly non-portable and won't work for a vast majority of users.


Yeah, I figured - but apparently I have other, more pressing problems... ;)



> > (meaning
> > that when the final .so is produced, the dynamic library dependencies on
> > some-lib and some-other-lib
> > will prefer the locations built in src/some-lib/... and
> > src/some-other-lib/...). But does this preclude me from
> > being able to distribute a binary package?
>
> Yes. And I doubt the package will work the way you described it at all,
> because the "deep" .so won't even be installed. Also there are potential
> issues in multi-arch R (please consider testing that as well).
>
>
Understood. I wasn't a fan of any of the potential solutions I'd seen (one
of which included source-only availability). I've seen some other folks
using the inst/ or data/ dirs for purposes like this, but I agree it's ugly
and has issues. You raise a great point, too, about multi-arch R. I have
potential users who are definitely on heterogeneous architectures. I
noticed that when I run R CMD INSTALL --build . to check my current build,
I end up with a src-${ARCH} directory for both x86_64 and i386 - is there
more explicit multi-arch testing I should be doing?
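
For concreteness, this is roughly what I've been exercising so far - the
flags below are the standard R CMD INSTALL options as I understand them,
not anything exotic:

```shell
# Build and install for every available sub-architecture (the default on
# a multi-arch R build), which is what leaves src-x86_64/ and src-i386/
# copies behind in my source tree.
R CMD INSTALL --build .

# Restrict the build to the primary architecture only, if I need to
# isolate one arch while debugging.
R CMD INSTALL --no-multiarch .
```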


>
> > If I do want to build a binary
> > distribution, is there a way I can
> > package up everything needed, not just the resulting .so?
> >
> > Or, are there better ways to bundle extension-specific third party
> > dependencies? ;) I'd rather not have
> > my users have to install obscure libraries globally on their systems.
> >
>
> Typically the best solution is to compile the dependencies as
> --disable-shared --enable-static --with-pic (in autoconf speak - you don't
> need to actually use autoconf). That way your .so has all its dependencies
> inside and you avoid all run-time hassle. Note that it is very unlikely
> that you can take advantage of the dynamic nature of the dependencies
> (since no one else knows about them anyway) so there is no real point to
> build them dynamically.
>
>
That is a much better solution, and the one I've been looking for! I was
afraid I'd have to manually specify all the dependency objects, but if I
just disable shared builds then that makes much more sense - I can let the
compiler and linker do the work for me.
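
If I'm reading that right, each bundled dependency gets built something
like this (the build prefix is just my sketch, and I'm assuming some-lib
uses a stock autoconf configure):

```shell
# Build some-lib as a static, position-independent archive so its objects
# can be folded into the package's own .so at link time - no rpath games.
cd src/some-lib
./configure --disable-shared --enable-static --with-pic \
    --prefix="$PWD/../build"
make
make install
cd ../..
```

and then src/Makevars just points PKG_LIBS at the resulting .a (archive
name hypothetical), e.g. `PKG_LIBS = ../src/build/lib/libsome.a`, instead
of -L/-l flags referencing a shared copy.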


> Also note that typically you want to use the package-level configure to
> run subconfigures, and *not* Makevars. (There may be reasons for an
> exception to that convention, but you need to be aware of the differences
> in multi-arch builds since Makevars builds all architectures at once from
> separate copies of the src directories whereas the presence of configure
> allows you to treat your package as one architecture at a time and you can
> pass-through parameters).
>
>
Understood. Is src/ still the appropriate directory for my third-party
packages? Also, do you happen to know of any packages off-hand that I can
use as a reference?
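
For the archives' sake, here's the rough shape of the package-level
configure I'm going to try, per your advice - dependency names are mine
and the flags are an untested sketch:

```shell
#!/bin/sh
# Package-level configure: with this present (instead of doing the work
# in Makevars), R treats the package one architecture at a time, so each
# bundled dependency is built for exactly the arch being installed.
here=`pwd`
for dep in some-lib some-other-lib; do
  (cd "src/$dep" && \
   ./configure --disable-shared --enable-static --with-pic \
       --prefix="$here/src/build" && \
   make && make install) || exit 1
done
```

with src/Makevars left to pick the static archives up via PKG_LIBS.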

Thanks Simon! Your insights here are invaluable. I really appreciate it.



Tyler




> Cheers,
> Simon
>
>
>


______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
