[Rd] Including multiple third party libraries in an extension

2011-11-12 Thread Tyler Pirtle
Hi,

I've got a C extension structured roughly like:

package/
  src/
    Makevars
    foo.c
    some-lib/...
    some-other-lib/...

where foo.c and Makevars define dependencies on some-lib and
some-other-lib. Currently, Makevars runs configure; make install for
some-lib and some-other-lib into a local build directory, which produces
shared libraries that I ultimately reference for foo.o via PKG_LIBS.

I'm concerned about distribution. I've set up the appropriate rpath magic
for the package's .so (meaning that when the final .so is produced, its
dynamic-library dependencies on some-lib and some-other-lib will prefer the
locations built under src/some-lib/... and src/some-other-lib/...). But
does this preclude me from being able to distribute a binary package? If I
do want to build a binary distribution, is there a way I can package up
everything needed, not just the resulting .so?
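For concreteness, the current setup looks roughly like this (the library
names and local install prefix are illustrative, not my real ones):

```make
# src/Makevars -- sketch of the current rpath-based approach
LOCAL = $(CURDIR)/local

PKG_CPPFLAGS = -I$(LOCAL)/include
# rpath points the final .so at the locally built shared libraries
PKG_LIBS = -L$(LOCAL)/lib -lsome-lib -lsome-other-lib -Wl,-rpath,$(LOCAL)/lib

$(SHLIB): deps

deps:
	(cd some-lib && ./configure --prefix=$(LOCAL) && $(MAKE) install)
	(cd some-other-lib && ./configure --prefix=$(LOCAL) && $(MAKE) install)
```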

Or, are there better ways to bundle extension-specific third-party
dependencies? ;) I'd rather not make my users install obscure libraries
globally on their systems.


Thanks!


Tyler

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Including multiple third party libraries in an extension

2011-11-12 Thread Tyler Pirtle
Thanks Simon, a few replies...

On Sat, Nov 12, 2011 at 6:14 AM, Simon Urbanek
wrote:

> Tyler,
>
> On Nov 11, 2011, at 7:55 PM, Tyler Pirtle wrote:
>
> > Hi,
> >
> > I've got a C extension structured roughly like:
> >
> > package/
> >   src/
> >     Makevars
> >     foo.c
> >     some-lib/...
> >     some-other-lib/...
> >
> > where foo.c and Makevars define dependencies on some-lib and
> > some-other-lib. Currently, Makevars runs configure; make install for
> > some-lib and some-other-lib into a local build directory, which produces
> > shared libraries that I ultimately reference for foo.o via PKG_LIBS.
> >
> > I'm concerned about distribution. I've set up the appropriate rpath
> > magic for the package's .so
>
> That is certainly non-portable and won't work for a vast majority of users.


Yeah, I figured, but apparently I have other, more pressing problems... ;)



> > (meaning
> > that when the final .so is produced, its dynamic-library dependencies on
> > some-lib and some-other-lib
> > will prefer the locations built under src/some-lib/... and
> > src/some-other-lib/...) But does this preclude me from
> > being able to distribute a binary package?
>
> Yes. And I doubt the package will work the way you described it at all,
> because the "deep" .so won't even be installed. Also there are potential
> issues in multi-arch R (please consider testing that as well).
>
>
Understood. I wasn't a fan of any of the potential solutions I'd seen (one
of which included source-only availability). I've seen some other folks
using the inst/ or data/ dirs for purposes like this, but I agree it's ugly
and has issues. You raise a great point, too, about multi-arch R. I have
potential users who are definitely on heterogeneous architectures. I
noticed that when I run R CMD INSTALL --build . to check my current build,
I end up with a src-${ARCH} for both x86_64 and i386 - is there more
explicit multi-arch testing I should be doing?


>
> > If I do want to build a binary
> > distribution, is there a way I can
> > package up everything needed, not just the resulting .so?
> >
> > Or, are there better ways to bundle extension-specific third party
> > dependencies? ;) I'd rather not have
> > my users have to install obscure libraries globally on their systems.
> >
>
> Typically the best solution is to compile the dependencies with
> --disable-shared --enable-static --with-pic (in autoconf speak - you don't
> need to actually use autoconf). That way your .so has all its dependencies
> inside and you avoid all run-time hassle. Note that it is very unlikely
> that you can take advantage of the dynamic nature of the dependencies
> (since no one else knows about them anyway), so there is no real point in
> building them dynamically.
>
>
That is a much better solution and the one I've been looking for! I was
afraid I'd have to manually specify all the dependency objects, but if I
just disable shared then that makes much more sense - I can let the
compiler and linker do the work for me.
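Just to make sure I've got it, I think in Makevars terms that means
something like this (paths and library names are my guesses, not a tested
recipe):

```make
# src/Makevars -- sketch of the static approach
LOCAL = $(CURDIR)/local

PKG_CPPFLAGS = -I$(LOCAL)/include
# static archives get folded into the package .so; no run-time lookup needed
PKG_LIBS = $(LOCAL)/lib/libsome-lib.a $(LOCAL)/lib/libsome-other-lib.a

$(SHLIB): deps

deps:
	(cd some-lib && ./configure --disable-shared --enable-static \
	    --with-pic --prefix=$(LOCAL) && $(MAKE) install)
	(cd some-other-lib && ./configure --disable-shared --enable-static \
	    --with-pic --prefix=$(LOCAL) && $(MAKE) install)
```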


> Also note that typically you want to use the package-level configure to
> run subconfigures, and *not* Makevars. (There may be reasons for an
> exception to that convention, but you need to be aware of the differences
> in multi-arch builds, since Makevars builds all architectures at once from
> separate copies of the src directories, whereas the presence of configure
> allows you to treat your package as one architecture at a time and you can
> pass-through parameters.)
>
>
Understood. Is src/ still the appropriate directory for my third-party
packages? Also, do you happen to know of any packages off-hand that I could
use as a reference?
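Something like this, maybe, for the package-level configure (a rough sketch
under my own assumptions about the sub-libraries, not a tested script):

```sh
#!/bin/sh
# package/configure -- build the bundled libraries statically before R
# compiles src/; the directory names and install prefix are illustrative
LOCAL="`pwd`/src/local"
for dep in some-lib some-other-lib; do
  (cd "src/$dep" && \
   ./configure --disable-shared --enable-static --with-pic \
               --prefix="$LOCAL" && \
   make && make install) || exit 1
done
```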

Thanks Simon! Your insights here are invaluable. I really appreciate it.



Tyler




> Cheers,
> Simon
>
>
>



Re: [Rd] Including multiple third party libraries in an extension

2011-11-12 Thread Tyler Pirtle
On Sat, Nov 12, 2011 at 8:08 PM, Tyler Pirtle  wrote:

> [earlier quoted text snipped]

Ah, also a few more questions...

I don't really understand the flow for developing multi-arch extensions.
Does configure run only once? Once per arch? What is the state of
src-${ARCH} by the time src/Makevars or the Makefile is executed? Is any of
this actually in the manual, and am I just missing it? ;)


And why does R_ARCH start with a '/'? ;)
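(My own guess at the answer, for the archive: R_ARCH is either empty on a
single-arch build or "/<arch>", so the leading slash lets it splice
directly into paths:)

```shell
# R_ARCH composes into paths like libs${R_ARCH}
R_ARCH="/x86_64"
echo "libs${R_ARCH}"   # libs/x86_64
R_ARCH=""
echo "libs${R_ARCH}"   # libs (single-arch build)
```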

thanks again!


Tyler






Re: [Rd] Including multiple third party libraries in an extension

2011-11-13 Thread Tyler Pirtle
On Sun, Nov 13, 2011 at 7:27 AM, Uwe Ligges  wrote:

>
> [earlier quoted text snipped]
Re: [Rd] Including multiple third party libraries in an extension

2011-11-13 Thread Tyler Pirtle
On Sun, Nov 13, 2011 at 6:25 PM, Simon Urbanek
wrote:

>
> [earlier quoted text snipped]

[Rd] unserialize and eager execution

2011-12-05 Thread Tyler Pirtle
Hi,

While debugging a network server I'm developing, I noticed something
unusual - a call to unserialize() resulted in an error about loading a
namespace.

I was a bit taken aback by this - why should unserializing an object cause
a namespace lookup? Are there any other side effects of unserialize() that
I should be cautious about? I've been digging through the R_Unserialize()
call; I haven't found the loadNamespace bit yet, but I assume it's in there
somewhere.

Is there any way to guard against R eagerly evaluating serialized data
(from serialize()) as it is being unserialized (unserialize())?
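For reference, here's a minimal way to reproduce what I think is going on
(MASS is just an example of a package that isn't loaded by default):

```r
# Serializing a closure records its environment. For a function that lives
# in a package namespace, that is a reference to the namespace, not a copy.
f <- MASS::fractions
bytes <- serialize(f, NULL)

# In a fresh R process that has never loaded MASS, rebuilding f's
# environment forces R to load the namespace again:
g <- unserialize(bytes)   # triggers loadNamespace("MASS")
```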


Thanks,


Tyler



Re: [Rd] unserialize and eager execution

2011-12-05 Thread Tyler Pirtle
On Mon, Dec 5, 2011 at 10:41 PM, Prof Brian Ripley wrote:

> Take a look at the information on serialization in 'R Internals'. AFAICS
> this is no different from what can happen when loading a saved workspace.
>
>
I'll give that a look, thanks.


>
> On 06/12/2011 05:29, Tyler Pirtle wrote:
>
>> Hi,
>>
>> While debugging a network server I'm developing, I noticed something
>> unusual - a call to unserialize() resulted in an error about loading a
>> namespace.
>>
>> I was a bit taken aback by this - why should unserializing an object
>> cause a namespace lookup? Are there any other side effects of
>> unserialize() that I should be cautious about? I've been digging through
>> the R_Unserialize() call; I haven't found the loadNamespace bit yet, but
>> I assume it's in there somewhere.
>>
>> Is there any way to guard against R eagerly evaluating serialized data
>> (from serialize()) as it is being unserialized (unserialize())?
>>
>
> Don't unserialize 'eagerly'.  Hint: that's what lazy-loading does.
>
>

I don't follow you. Could you elaborate?


Thanks,


T



>
>>
>> Thanks,
>>
>>
>> Tyler
>>
>> [[alternative HTML version deleted]]
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>
>
>
> --
> Brian D. Ripley,  rip...@stats.ox.ac.uk
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford,     Tel:  +44 1865 272861 (self)
> 1 South Parks Road,       +44 1865 272866 (PA)
> Oxford OX1 3TG, UK        Fax:  +44 1865 272595
>



[Rd] Ghost RefClasses

2012-05-16 Thread Tyler Pirtle
Hi there, this seems weird to me. Perhaps someone can explain what's going
on?

I'm creating a RefClass, then unloading the methods package, re-loading it,
and declaring a different RefClass with the same name - but I'm getting
instances of the previously defined class.


$ R

R version 2.15.0 (2012-03-30)
Copyright (C) 2012 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-apple-darwin9.8.0/x86_64 (64-bit)


> # Make a function that returns a RefClass.
> boot <- function() {
+   library(methods)
+   setRefClass("someclass", fields = list(a = "numeric"),
+   methods = list(
+ addSome = function(){
+   a <<- a+1
+   a
+ }))
+ }
> j <- boot()
> s <- j$new(a=1)
> s$addSome()
[1] 2
> # Unload (detach) methods.
> detach("package:methods", character.only=TRUE, force=TRUE, unload=TRUE)
unloading 'methods' package ...
> search()
[1] ".GlobalEnv"        "package:stats"     "package:graphics"
[4] "package:grDevices" "package:utils"     "package:datasets"
[7] "Autoloads"         "package:base"
> # Define a new class, load methods.
> boot2 <- function() {
+   library(methods)
+   setRefClass("someclass", fields = list(a = "numeric"),
+   methods = list(
+ addSome = function(){
+   a <<- a+3
+   a
+ }))
+ }
> j <- boot2()
> s <- j$new(a=1)
> s$addSome()
[1] 2

I'd expect the last call to addSome() to return 4, not 2.

Thanks,



Tyler
