Re: [Rd] Using gcc4 visibility features

2006-01-05 Thread elijah wright

> Subject: [Rd] Using gcc4 visibility features
> 
> R-devel now makes use of gcc4's visibility features: for an in-depth 
> account see
>
> http://people.redhat.com/drepper/dsohowto.pdf


does this mean that we now have a dependency on gcc4, or just that it 
"can" use the features of gcc4?

clarification, please.

thanks,

--elijah



Re: [Rd] Using gcc4 visibility features

2006-01-05 Thread elijah wright

>>> R-devel now makes use of gcc4's visibility features: for
>>> an in-depth account see
>
>elijah> does this mean that we now have a dependency on
>elijah> gcc4, or just that it "can" use the feature of gcc4?
>
> the latter (of course!)


that's what i expected, i was just checking :)

--elijah



Re: [Rd] Provide both shlib and standard versions of R?

2006-01-15 Thread elijah wright

>> That is not true, almost all binaries come with R as shared library - 
>> it is in fact the default on Mac OS X and Windows. Most Linux 
>> distributions provide a shared library binary as well.
>
> This would be good news. But at least, under linux,
>
> ./configure --help
>   --enable-R-shlib        build the shared/dynamic library 'libR' [no]
>
> This option is not enabled by default.

then either build your own with the correct options or talk to your 
distribution's packaging team.

on debian:

[EMAIL PROTECTED]:/usr/lib/R/lib$ ls -al /usr/lib/R/lib
total 2900
drwxr-xr-x  2 root root    4096 Jan 10 20:54 .
drwxr-xr-x 11 root root    4096 Jan 10 20:54 ..
-rw-r--r--  1 root root 1810072 Jan  7 20:44 libR.so
-rw-r--r--  1 root root 1139796 Jan  7 20:44 libRlapack.so
lrwxrwxrwx  1 root root      27 Jun  8  2005 libggobi.so -> ../../ggobi/lib/libggobi.so
lrwxrwxrwx  1 root root      28 Jun  8  2005 libgtkext.so -> ../../ggobi/lib/libgtkext.so


as you can see, there's clearly a nice libR.so sitting here.
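
a quick way to check the same thing from inside R itself, assuming a
Unix-style install where the shared library is named libR.so (the file
name differs on other platforms), is something like:

  # does this R installation have a shared libR next to it?
  file.exists(file.path(R.home(), "lib", "libR.so"))
  # TRUE on a build configured with --enable-R-shlib (or a distro package)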

--elijah



Re: [Rd] Building R under Linux - library dependencies

2016-09-07 Thread elijah wright
On Wed, Sep 7, 2016 at 1:50 PM, Paweł Piątkowski  wrote:

> > | Is there a way to overcome this problem? Precompiled versions of R can
> be installed on various system configurations, so I guess that there should
> be a way to compile it in a version-agnostic manner.
> >
> > Yes, for example by
> >
> >   -- using a Docker container which is portable across OSs (!!) and
> versions
>
> Docker R containers are north of 250 MB. I have checked experimentally
> that you can trim R down to 16 MB (!) and you'll still be able to execute
> it (though with warnings). That *is* quite a difference, especially when
> deploying small applications.



... I would guesstimate that the libraries required to run R with any
useful set of packages add up to quite a bit more than the cited 16 MB ...



> >   -- relying on package management which is what every Linux distro does
> >
> > (...)
> >
> > PS For the latter point, our .deb based R package currently shows this:
> >
> > (...)
> >
> > Depends: zip, unzip, libpaper-utils, xdg-utils, libblas3 | libblas.so.3,
> libbz2-1.0, libc6 (>= 2.23), libcairo2 (>= 1.6.0), libcurl3 (>= 7.28.0),
> libglib2.0-0 (>= 2.12.0), libgomp1 (>= 4.9), libjpeg8 (>= 8c), liblapack3 |
> liblapack.so.3, liblzma5 (>= 5.1.1alpha+20120614), libpango-1.0-0 (>=
> 1.14.0), libpangocairo-1.0-0 (>= 1.14.0), libpcre3, libpng12-0 (>=
> 1.2.13-4), libreadline6 (>= 6.0), libtcl8.6 (>= 8.6.0), libtiff5 (>=
> 4.0.3), libtk8.6 (>= 8.6.0), libx11-6, libxt6, zlib1g (>= 1:1.1.4), ucf (>=
> 3.0), ca-certificates
>
> Sure, package dependencies would be great as well - at least you'd be sure
> that users of, say, Debian-based distros will be able to run this portable
> R, as long as they've installed the required libraries. But notice that in
> your example package versions equal *or greater* than listed are required -
> so if someone has upgraded their system, they still will be able to run
> that R. With a version built from source you need *exactly* the same
> version as on the machine where R was compiled. Hence my question: how come
> the precompiled distribution of R has "less strict" library requirements
> than manually compiled versions?
>

Package managers don't usually cite 'less than' versions for packages,
because how do you assert that a version which hasn't been released yet
won't work?
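
if you want to sanity-check the same "greater or equal" style of constraint
from inside R, a rough sketch would be the following (the package names and
minimum versions here are placeholders, not a real requirement set):

  # check installed packages against minimum (>=) versions, the same
  # direction of constraint the package managers use
  reqs <- c(MASS = "7.3.0", lattice = "0.20.0")
  sapply(names(reqs), function(p) packageVersion(p) >= reqs[[p]])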

You could go on a tear and build statically linked versions of
R-with-everything-you-need, and maybe avoid the library madness... but this
is sort of a fool's errand and a huge consumer of time.  OS vendors and
compiler developers have stopped doing things that way for a reason: it's
much simpler to reduce duplication and keep everything working, while still
allowing security issues to be patched out, when you are *just slightly*
more flexible.

ABI compatibility and library versioning are, I think, fairly well
understood.

Doing this stuff with a container is very much the easiest route, if you
actually want it to be completely portable.  You're certainly welcome to
start with an Alpine Linux base, add R on top, and then start paring...
but somewhere in there I start to not understand the point.  It's a lot of
time spent on something that doesn't seem that beneficial when you've got
(even reasonably modern) hardware that can deal with a tiny bit of extra
bloat.  SD cards and USB sticks are pretty cheap everywhere now, aren't
they?

I could see, maybe, putting time into this as some kind of retrocomputing
project... but probably not otherwise.

best,

--e


Re: [Rd] On implementing zero-overhead code reuse

2016-10-04 Thread elijah wright
Shower thoughts:

Are you digging for something like what you'd use with a CI/CD pipeline?
E.g., building a workflow that pulls a tag from a couple of code
repositories, checks them out into a workspace, installs prereqs, and then
runs your code/tasks in a repeatable fashion?

I'm not aware of a thing that is like a Gemfile or a Berksfile or a
package.json for R, but you can surely approximate that with a job step
that runs install.packages() from a snippet of R code.
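
a minimal sketch of that kind of job step, assuming a file named something
like deps.R checked into the repo (the package names are placeholders for
whatever the project actually needs):

  # deps.R -- rough stand-in for a Gemfile/package.json style install step
  pkgs <- c("data.table", "ggplot2")   # placeholder dependency list
  missing <- setdiff(pkgs, rownames(installed.packages()))
  if (length(missing) > 0)
      install.packages(missing, repos = "https://cloud.r-project.org")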

[I did have a quick glance at the install.packages docs to refresh my
memory -- it looks like it's biased toward installing the latest *unless*
you point it at something like an archive that has your package selections
frozen in time.  You can either store the deps yourself, or find an archive
that has historical snapshots by date.  I would expect, really, that the
CRAN packages are unlikely to suddenly stop being version controlled or for
their history to vanish into the ether...  Maybe someone stores zfs
snapshots or similar of CRAN, on a date-by-date basis?  It should be cheap
(disk-wise) to do...]
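
and if you do find (or build) such a date-stamped archive, freezing a run
to a particular day is just a matter of the repos argument; the URL below
is purely illustrative, not a real service:

  # install against a hypothetical CRAN snapshot frozen at a given date
  snapshot <- "https://cran-snapshots.example.org/2016-10-01"
  install.packages(c("data.table", "ggplot2"), repos = snapshot)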

In my ideal world, 'newer packages should mean more accurate results':
running code with older package versions should mean that you're
duplicating their old errors, which to me seems not useful in most cases.

best,

--e


On Mon, Oct 3, 2016 at 6:24 PM, Kynn Jones  wrote:

> Martin, thanks for that example.  It's definitely eye-opening, and
> very good to know.
>
> The installation business, however, is still a killer for me.  Of
> course, it's a trivial step in a simple example like the one you
> showed.  But consider this scenario:  suppose I perform an analysis
> that I may publish in the future, so I commit the project's state at
> the time of the analysis, and tag the commit with the KEEPER tag.
> Several months later, I want to repeat that exact analysis for
> whatever reason.  If the code for the analysis was in Python (say),
> all I need to do is this (at the Unix command line):
>
> % git checkout KEEPER
> % python src/python/go_to_town.py
>
> ...knowing that the `git checkout KEEPER` command, *all by itself*,
> has put the working directory in the state I want it to be before I
> re-do the analysis.
>
> AFAICT, if the code for the analysis was in R, then `git checkout`, by
> itself, would *not* put the working directory in the desired state.  I
> still need to re-install all the R libraries in the repo.  And I
> better not forget to do this re-installation, otherwise I will end up
> running code different from the one I thought I was running.  (I find
> this prospect horrifying, for some reason.)
>
> A similar need to re-install stuff would arise whenever I update the repo.
>
> Please correct me if I'm wrong.
>
>



Re: [Rd] how useful could be a fast and embedded database for the R community?

2014-12-24 Thread elijah wright
I believe in patches and working code.

You're proposing to compete with the likes of sqlite and berkeley db
-- not small competition, with excellent performance characteristics
when used properly.

You also used the 'b'illions word in reference to data sets - really?

best,

--e


On Tue, Dec 23, 2014 at 11:31 AM, joanv  wrote:
> Dear all,
>
> I'm developing a new database with the ability to perform very fast seek,
> insert, and delete operations.  It is also able to perform very fast
> comparisons of datasets.  It has been designed to work embedded in other
> programs written in R, Fortran, C++, etc.
>
> It can manage efficiently billions of numeric datasets in a single machine.
>
> Right now I do not know in which fields of the R community such a database
> could be helpful, or whether there is a need for such a capability in the R
> community.
>
> Could someone help me with this topic?  Partners for the project are also
> wanted, especially R experts, or experts on other kinds of calculation
> programs (vasp, gaussian, etc.).
>
> Regards and thank you.
>
>
>
>
>



Re: [Rd] Compilation failure on Solaris: any advice?

2012-12-05 Thread elijah wright
Is that a build with "good old" Studio or a build with a recent GCC?

I don't have any direct comments that would be helpful to you, but let me
know if you need a place to do some test builds and try to figure it out.
I can certainly help you with that.

[Are more Solaris-esque build slaves needed?  Someone give a shout if so...
we can sponsor some infrastructure there.]

--elijah
(@Joyent)



On Mon, Dec 3, 2012 at 11:28 AM, Jon Clayden  wrote:

> Dear all,
>
> The current version of my RNiftyReg package is failing to compile on CRAN's
> Solaris testbed, but I don't have access to a Solaris system to debug on,
> and Googling the error hasn't been very helpful. The error is
>
> CC -library=stlport4 -I/home/ripley/R/cc/include -DNDEBUG -DNDEBUG
> -DRNIFTYREG -I/usr/local/include -KPIC  -O -xlibmil -xtarget=native
> -nofstore  -c niftyreg.cpp -o niftyreg.o
> "_reg_f3d_sym.cpp", line 25: Error: reg_f3d may not have a type qualifier.
> "niftyreg.cpp", line 527: Where: While instantiating
> "reg_f3d_sym::reg_f3d_sym(int, int)".
> "niftyreg.cpp", line 527: Where: Instantiated from non-template code.
> "_reg_f3d_sym.cpp", line 26: Error: reg_f3d cannot be initialized
> in a constructor.
> "niftyreg.cpp", line 527: Where: While instantiating
> "reg_f3d_sym::reg_f3d_sym(int, int)".
> "niftyreg.cpp", line 527: Where: Instantiated from non-template code.
> "_reg_f3d_sym.cpp", line 26: Error: Could not find
> reg_f3d::reg_f3d() to initialize base class.
> "niftyreg.cpp", line 527: Where: While instantiating
> "reg_f3d_sym::reg_f3d_sym(int, int)".
> "niftyreg.cpp", line 527: Where: Instantiated from non-template code.
> 3 Error(s) detected.
> *** Error code 2
> make: Fatal error: Command failed for target `niftyreg.o'
>
>
> (Full log at [1].) The relevant part of the source is a C++ class
> constructor, part of the library that my package interfaces to:
>
> template <class T>
> reg_f3d_sym<T>::reg_f3d_sym(int refTimePoint,int floTimePoint)
> :reg_f3d<T>::reg_f3d(refTimePoint,floTimePoint)
> {
> this->executableName=(char *)"NiftyReg F3D SYM";
>
> this->backwardControlPointGrid=NULL;
> this->backwardWarped=NULL;
> this->backwardWarpedGradientImage=NULL;
> this->backwardDeformationFieldImage=NULL;
> this->backwardVoxelBasedMeasureGradientImage=NULL;
> this->backwardNodeBasedGradientImage=NULL;
>
> this->backwardBestControlPointPosition=NULL;
> this->backwardConjugateG=NULL;
> this->backwardConjugateH=NULL;
>
> this->backwardProbaJointHistogram=NULL;
> this->backwardLogJointHistogram=NULL;
>
> this->floatingMaskImage=NULL;
> this->currentFloatingMask=NULL;
> this->floatingMaskPyramid=NULL;
> this->backwardActiveVoxelNumber=NULL;
>
> this->inverseConsistencyWeight=0.1;
>
> #ifndef NDEBUG
> printf("[NiftyReg DEBUG] reg_f3d_sym constructor called\n");
> #endif
> }
>
> The error does not occur on any Windows, Linux or OS X system which I have
> access to, so this would seem to be an issue relating to the Solaris
> compiler toolchain in particular. Can anyone shed any light on it, please?
>
> Thanks in advance,
> Jon
>
> --
> [1]
>
> http://www.r-project.org/nosvn/R.check/r-patched-solaris-x86/RNiftyReg-00install.html
>



Re: [Rd] Compilation failure on Solaris: any advice?

2012-12-06 Thread elijah wright
On Thu, Dec 6, 2012 at 9:34 AM, Dirk Eddelbuettel  wrote:

> | > [Are more Solaris-esque build slaves needed?  Someone give a shout
> | > if so... we can sponsor some infrastructure there.]
> |
> | as long as we (R community at large "in principle", de facto,
> |   AFAIK, Prof Brian Ripley)
> | can have a Solaris testbed for CRAN which is reflected in the
> | CRAN package check, it would be really great if someone provided
> | a "Solaris-builder" server similar to the
> | win-builder.r-project.org one that Uwe (and his Department/
> | University) provides.
>
> In a world with finite resources in terms of coding time and attention span,
> should we not provide such resources first for OS X and Linux?  But maybe we
> need some empirics first so that we can rank-order compilers by relevance.
>


I would fully expect the distribution of build issues to roughly match the
popularity of the platforms, modulated by something like the approximate
level of expertise on each platform.  E.g. you may not see many problems
from Solaris folks for a variety of reasons, popularity being only one
aspect of this...  :-)



> I for one get more error emails from Intel icc users (say, two a year) than
> from Solaris users (probably about one or two, over all these years; and
> yes, I excluded the CRAN maintainers here).
>
> That said, we do have a concrete offer for help from Elijah which should a)
> be commended ("nice job!") and used



I'm going to get 'official permission' from someone, in hopes that whatever
we do lasts longer.  I'm expecting a positive reception, of course, or I
wouldn't have said it could be done.  :-)

(oh - yes, I got unequivocal permission to go full speed ahead.  :-) )


Once I get that in hand, is B. Ripley the right person to reach out to?
I'm pretty unfamiliar with what the CRAN autobuilder world is like these
days.



> but b) we should maybe try to think about
> what other resources would help.
>

Yup, this seems like a good time to bring that up, for sure.  :-)

best,

--e



Re: [Rd] Who does develop the R core and libs, and how / where is it hosted?

2013-01-15 Thread elijah wright
On Tue, Jan 15, 2013 at 12:46 PM, Hervé Pagès  wrote:


>> and given R's modularity that is fortunately not very often the case.
>
> Modularity would be even better if more things *in core* were made
> generics.  For example, why was the stuff in parallel not made generic?
> (at least S3 generic)
>

The better to give interesting people GSoC projects with, of course!  ;-)

... nearly everyone appreciates patches that improve their projects to
noticeable benefit.  If you'd like parallel better done a different way...
collude and make it rock.
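
to make the "generic" point concrete, here's a toy sketch of the S3
pattern; myApply() and the clustered_list class are invented for
illustration, not anything parallel actually exports:

  # invented generic: one call site, sequential or parallel by dispatch
  myApply <- function(x, FUN, ...) UseMethod("myApply")

  # default method: plain sequential lapply()
  myApply.default <- function(x, FUN, ...) lapply(x, FUN, ...)

  # method for a made-up "clustered_list" class carrying a cluster attribute
  myApply.clustered_list <- function(x, FUN, ...)
      parallel::parLapply(attr(x, "cluster"), unclass(x), FUN, ...)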

best,

--e
