Re: [R-pkg-devel] Concise Summary For Any Variety Of R Function

2021-10-20 Thread Simon Urbanek


Dario,

there is no such thing as an S4 function. However, all functions have an 
environment and for functions from packages that environment will be the 
namespace of the package. So in those special cases you can use 
environmentName() to get the name, e.g.:

> who = function(f) cat(deparse(substitute(f)), "from", environmentName(environment(f)), "\n")
> who(ls)
ls from base 
> who(rnorm)
rnorm from stats 
> who(who)
who from R_GlobalEnv 

However, this won't work for functions defined in arbitrary environments which 
have no name:

> local({ f= function(x) x
+ who(f)
+ })
f from  

Depending on how much time you want to spend, you can do a search up the frames 
to find a named environment, but the concept you're trying to implement looks 
very fragile.
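
A hedged sketch of such a search (the helper name who2 and the "<anonymous>" label are invented for illustration; this variant walks the enclosing environments rather than the call frames, and is exactly as fragile as noted above):

```r
## Walk up the chain of enclosing environments until one with a
## name is found; purely illustrative.
who2 <- function(f) {
  e <- environment(f)
  nm <- environmentName(e)
  while (nm == "" && !identical(e, emptyenv())) {
    e <- parent.env(e)
    nm <- environmentName(e)
  }
  cat(deparse(substitute(f)), "from",
      if (nm == "") "<anonymous>" else nm, "\n")
}
```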

Cheers,
Simon



> On Oct 19, 2021, at 9:00 PM, Dario Strbenac  
> wrote:
> 
> Good day,
> 
> I would like to create a concise show method for an S4 class that I am 
> developing. One class slot stores a function specified by the end user. It 
> could be a basic function, an S3 function, an S4 function. For S4 functions, 
> I can get a concise representation:
> 
>> capture.output(showMethods(limmaSelection))[1]
> [1] "Function: limmaSelection (package ClassifyR)"
> 
> If the user specified the function bdiag from Matrix, how could I generalise 
> the show code to describe the function, such that the output would be 
> "Function bdiag from package Matrix" and similarly for an S3 function?
> 
> --
> Dario Strbenac
> University of Sydney
> Camperdown NSW 2050
> Australia
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How does one install a libtool generated libfoo.so.1 file into ./libs/?

2021-10-20 Thread Simon Urbanek


Pariksheet,

dynamic linking won't work, compile a static version with PIC enabled. If the 
subproject is autoconf-compatible this means using --disable-shared --with-pic. 
Then you only need to add libfoo.a to your PKG_LIBS.
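
For an autoconf-compatible subproject the whole recipe might look roughly like this (the libfoo name and paths are placeholders, not taken from the actual package):

```sh
# In the package's configure script: build the bundled library
# statically, with position-independent code enabled.
cd src/libfoo
./configure --disable-shared --enable-static --with-pic
make

# Then in src/Makevars, link the static archive directly:
# PKG_LIBS = libfoo/.libs/libfoo.a
```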

Cheers,
Simon




> On Oct 19, 2021, at 4:13 PM, Pariksheet Nanda  
> wrote:
> 
> Hi folks,
> 
> My package [1] depends on a C library libghmm-dev that's available in many 
> GNU/Linux package managers.  However, it's not available on all platforms and 
> if this dependency is not installed, my autoconf generated configure script 
> falls back to compiling and installing the dependency from my 
> bundled copy of upstream's pristine source tarball [2].  Now, because 
> upstream uses automake which in turn uses libtool, I also use automake and 
> libtool in my build process to hook into their build artifacts using SUBDIRS 
> and *-local automake rules [3].
> 
> As you may know libtool appends `-version-info` to its generated shared 
> libraries in the form "libfoo.so.1.2.3".  I'm linking against the bundled 
> library which only sets the first value, namely libghmm.so.1.
> 
> The trouble is, R's installation process will only copy compiled files from 
> ./libs/ that have exactly the extension ".so" and files ending with ".so.1" 
> are ignored.
> 
> My current workaround is to set -Wl,-rpath to the location of the generated 
> ".so.1" file.  This allows the installation process to complete and sneakily 
> pass the 2 canonical tests:
> 
> 
> ** testing if installed package can be loaded from temporary location
> ---snip---
> ** testing if installed package can be loaded from final location
> 
> 
> However, not surprisingly, when I try to load the library from the final 
> location after the temporary directory has been deleted it fails with:
> 
> 
> library(tsshmm)
> ...
> Error: package or namespace load failed for 'tsshmm' in dyn.load(file, 
> DLLpath = DLLpath, ...):
> unable to load shared object 
> '/home/omsai/R/x86_64-pc-linux-gnu-library/4.1/tsshmm/libs/tsshmm.so':
>  libghmm.so.1: cannot open shared object file: No such file or directory
> 
> 
> I can rename the dependency from ".so.1" to ".so" to also get the dependent 
> library to the final location.  But it still fails with the above error 
> because the library links against the ".so.1" file and I would need an 
> accompanying symlink.  I tried creating a symlink but can't think of how to 
> get the symlink to the final location.  If my Makefile writes the symlink 
> into ./inst/libs/libghmm.so.1 during compile time it is not actually 
> installed; perhaps because the ./inst/ sub-directories are only copied 
> earlier on when staging and are ignored later?  If I were to create that 
> dangling symlink inside ./inst/libs/ instead of generating it later during 
> compile time, devtools::install() complains about the broken symlink with:
> 
> 
> cp: cannot stat 'tsshmm/inst/libs/libghmm.so.1': No such file or directory
> 
> 
> So is there some mechanism to copy arbitrary files or symlinks to the final 
> install location?  I prefer not to patch upstream's Makefile.am to remove 
> their -version-info, but currently that's the only option I can think of.  I 
> can't find helpful discussion surrounding this in the mailing list archives.
> 
> Last week when I posted for help with my package on another issue on the 
> Bioconductor mailing list, one adventurous soul tried installing the package 
> using `remotes::install_gitlab("coregenomics/tsshmm")`.  This won't work 
> because I haven't committed the generated autotools files; if anyone wants to 
> play with it, you'll have to follow the 2 additional steps run by the 
> continuous integration script, namely, unpacking ./src/ghmm-0.9-rc3.tar.gz 
> into ./src/ and running `autoreconf -ivf` in the package's top-level 
> directory where configure.ac is located.
> 
> Any help appreciated,
> 
> Pariksheet
> 
> 
> [1] https://gitlab.com/coregenomics/tsshmm
> 
> [2] The only patches I apply to the dependency are to fix 2 bugs for 
> compiling, and to remedy a warning severe enough to be flagged by `R CMD 
> check`.
> 
> [3] You can see my Makefile.am here:
> https://gitlab.com/coregenomics/tsshmm/-/blob/master/src/Makefile.am
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] CRAN Mac Builder based on M1

2021-10-20 Thread Simon Urbanek


Dear Mac useRs.

I'm pleased to announce that thanks to the R Foundation and donations from 
users like you we are now able to offer a CRAN Mac Builder based on M1 hardware 
which allows package authors that don't have access to a recent Mac to check 
their package using the same process as CRAN:

https://mac.r-project.org/macbuilder/submit.html

The machine mirrors all CRAN packages nightly and uses the CRAN building and 
checking system[1] so the results should be as close to the CRAN setup as 
feasible.
The resources are limited, so do not treat this as a CI setup, but rather as a 
service for package authors to facilitate checks before CRAN submissions.

The setup is new and experimental, so please contact me for any comments, 
problem reports or suggestions regarding the Mac Builder. For any Mac-related 
questions, please use R-SIG-Mac instead of this mailing list.

Cheers,
Simon

[1] - https://svn.r-project.org/R-dev-web/trunk/QA/Simon/packages

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to update the email address of a package maintainer

2021-11-08 Thread Simon Urbanek


From CRAN Policy:


> Explain any change in the maintainer’s email address and if possible send 
> confirmation from the previous address (by a separate email to 
> cran-submissi...@r-project.org) or explain why it is not possible
> 


Cheers,
Simon

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Inconsistent available package dependencies

2021-11-18 Thread Simon Urbanek


Michael,

this is really a CRAN question rather than a package question so you may have 
more luck asking CRAN, but generally it is not a good idea to have strong 
dependencies (Depends/Imports) on Bioconductor packages. If you want to use 
non-CRAN packages I would recommend a weak dependency like Suggests, since 
otherwise your package won't work at all when they are unavailable. For 
example, Bioconductor does not provide any 
binaries for the big-sur CRAN release of R, so your package cannot be installed 
by users if it depends on any BioC packages with native code. In fact, the 
check report you linked already warns you about your dependencies in general:

Version: 4.0.2
Check: package dependencies
Result: NOTE
Imports includes 22 non-default packages.
Importing from so many packages makes the package vulnerable to any of
them becoming unavailable. Move as many as possible to Suggests and
use conditionally.
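
"Use conditionally" typically means guarding each optional code path with requireNamespace(); a minimal sketch (the function and the package's role here are illustrative, not the actual netgsa code):

```r
## Degrade gracefully when a suggested Bioconductor package is
## absent, instead of failing at install or load time.
annotate_ids <- function(ids) {
  if (!requireNamespace("org.Hs.eg.db", quietly = TRUE)) {
    warning("org.Hs.eg.db is not installed; returning IDs unannotated")
    return(ids)
  }
  ## ... actual annotation using org.Hs.eg.db would go here ...
}
```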

FWIW the reason you got the notification later is that macOS checks are not 
done immediately on submission; otherwise the package would not have been 
accepted in the first place. That said, if you have issues with dependencies on 
the macOS build machines just contact me and I can install them if they are 
available (I have now kick-started a script that attempts to automatically 
detect and install missing dependencies if available so you may see it in the 
next check run), but as discussed above you should still re-think your 
dependencies.

Cheers,
Simon


> On Nov 16, 2021, at 3:33 PM, Michael Hellstern  wrote:
> 
> Hi all,
> 
> I maintain a package (netgsa) that has been on CRAN since September 2021
> and recently got an email about CRAN check failures. There are 2 errors and
> 1 warning. The failures are listed here:
> https://cran.r-project.org/web/checks/check_results_netgsa.html.
> 
> The failures are a result of Bioconductor packages. Both errors are about
> package dependencies not being available (org.Hs.eg.db and/or ndexr) while
> the warning comes from the vignette building failing which is caused by an
> error in the graphite package. I checked and these packages were and still
> are available on Bioconductor. The errors occurred on macOS 10.13.6. While
> not exactly the same OS, I ran R CMD check --as-cran on my macOS 10.15.7
> and did not get any errors or warnings. It seems odd that I would get a
> package dependency availability error for only 2 flavors and that I would
> not be able to reproduce this on a similar OS.
> 
> I searched online but have not found anything useful. I built the package
> under R 4.0.2 which is now a little dated so am wondering if it is related
> to this? Do I need to rebuild using the most recent version of R? Any
> advice on how to resolve this would be greatly appreciated!
> 
> Best,
> 
> Mike
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN submission error when running tests in testthat

2021-11-25 Thread Simon Urbanek


Nathan,

testthat is notorious for obfuscation and unhelpful output as can be clearly 
seen in the head of testthat.Rout.fail:

> library(testthat)
> library(BCEA)

Attaching package: 'BCEA'

The following object is masked from 'package:graphics':

contour

> 
> test_check("BCEA")

 *** caught segfault ***
address 0x10d492ffc, cause 'memory not mapped'

However this appears to be hard to debug, because it is fallout from some 
memory corruption and/or allocator mismatch: the crash happens in free() while 
doing GC (see below). Since it happens in the GC, many bad things happen 
afterwards.
With some lldb magic I could trace that the crash happens during
 ..getNamespace(c("Matrix", "1.3-3"), "stanfit")
 load(test_path("data", "stanfit.RData"))
but as I said that's likely too late - the memory corruption/issue likely 
happened before. Since BCEA itself doesn't have native code, this is likely a 
bug in some of the packages it depends on, but quite a serious one since it 
affects subsequent code in R.

The list of packages loaded at the time of the crash - so one of them is the 
culprit:

 [1] "rstan"  "tidyselect" "purrr"  "reshape2"  
 [5] "lattice""V8" "colorspace" "vctrs" 
 [9] "generics"   "testthat"   "stats4" "BCEA"  
[13] "loo""grDevices"  "R2jags" "utf8"  
[17] "rlang"  "pkgbuild"   "pillar" "glue"  
[21] "withr"  "DBI""matrixStats""lifecycle" 
[25] "plyr"   "stringr""munsell""gtable"
[29] "coda"   "codetools"  "inline" "callr" 
[33] "ps" "parallel"   "curl"   "fansi" 
[37] "methods""Rcpp"   "scales" "desc"  
[41] "RcppParallel"   "StanHeaders""GrassmannOptim" "jsonlite"  
[45] "abind"  "gridExtra"  "winch"  "rjags" 
[49] "ggplot2""stats"  "datasets"   "graphics"  
[53] "stringi""processx"   "dplyr"  "grid"  
[57] "rprojroot"  "cli""tools"  "magrittr"  
[61] "tibble" "crayon" "pkgconfig"  "Matrix"
[65] "MASS"   "ellipsis"   "utils"  "prettyunits"   
[69] "assertthat" "base"   "boot"   "R6"
[73] "R2WinBUGS"  "compiler"  

My guess would be that the issue could be in RcppParallel which overrides the 
memory allocator:

* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS 
(code=1, address=0x11649fffc)
  * frame #0: 0x0001097c517f libtbbmalloc.dylib`__TBB_malloc_safer_msize + 
63
frame #1: 0x7fff76f746fd libsystem_malloc.dylib`free + 96
frame #2: 0x0001001c9227 libR.dylib`RunGenCollect at memory.c:1114 [opt]
frame #3: 0x0001001c9038 libR.dylib`RunGenCollect(size_needed=0) at 
memory.c:1896 [opt]
frame #4: 0x0001001bf769 libR.dylib`R_gc_internal(size_needed=0) at 
memory.c:3129 [opt]

(lldb) image lookup -va 0x0001097c517f
  Address: libtbbmalloc.dylib[0x0001117f] 
(libtbbmalloc.dylib.__TEXT.__text + 65375)
  Summary: libtbbmalloc.dylib`__TBB_malloc_safer_msize + 63
   Module: file = 
"/Volumes/Builds/packages/high-sierra-x86_64/Rlib/4.1/RcppParallel/lib/libtbbmalloc.dylib",
 arch = "x86_64"
   Symbol: id = {0x060c}, range = 
[0x0001097c5140-0x0001097c5290), mangled="__TBB_malloc_safer_msize"

but that's just a wild guess... (CCing Kevin just in case he can shed a light 
on whether TBB allocator should be involved in regular R garbage collection).

Cheers,
Simon



> On Nov 25, 2021, at 5:37 AM, Nathan Green via R-package-devel 
>  wrote:
> 
> Hi,
> I've getting an ERROR when submitting a new release of our package BCEA to 
> CRAN which I'm having problems understanding and reproducing. Its passing 
> CHECK locally and GitHub Actions standard check 
> (https://github.com/n8thangreen/BCEA/actions/runs/1494595896).
> The message is something to do with testthat. Any help would be gratefully 
> received.
> Thanks!
> Nathan
> 
> From https://cran.r-project.org/web/checks/check_results_BCEA.html
> Here's the error message:
> Check: tests, Result: ERROR
>Running ‘testthat.R’ [5s/5s]
>  Running the tests in ‘tests/testthat.R’ failed.
>  Last 13 lines of output:
>33: tryCatch(withCallingHandlers({eval(code, test_env)if (!handled 
> && !is.null(test)) {skip_empty()}}, expectation = 
> handle_expectation, skip = handle_skip, warning = handle_warning, message 
> = handle_message, error = handle_error), error = handle_fatal, skip = 
> function(e) {})
>34: test_code(NULL, exprs, env)
>35: source_file(path, child_env(env), wrap = wrap)
>36: FUN(X[[i]], ...)
>37: lapply(test_paths, test_one_file, env = env, wrap = wrap)
>38: doTryCatch(return(expr), name, pare

[R-pkg-devel] rstan issue [Was: CRAN submission error when running tests in testthat]

2021-11-25 Thread Simon Urbanek


Kevin,

thanks, that's very helpful! So this is a serious bug in rstan - apparently 
they only do that on macOS which explains why other platforms don't see it:

.onLoad <- function(libname, pkgname) {
[...]
  ## the tbbmalloc_proxy is not loaded by RcppParallel which is linked
  ## in by default on macOS; unloading only works under R >= 4.0 so that
  ## this is only done for R >= 4.0
  if(R.version$major >= 4 && Sys.info()["sysname"] == "Darwin") {
  tbbmalloc_proxy  <- system.file("lib/libtbbmalloc_proxy.dylib", 
package="RcppParallel", mustWork=FALSE)
  tbbmalloc_proxyDllInfo <<- dyn.load(tbbmalloc_proxy, local = FALSE, now = 
TRUE)
  }

I can confirm that commenting out that part solves the segfault and BCEA passes 
the tests.

@Ben, please fix and submit a new version of rstan (see discussion below).

Thanks,
Simon



> On Nov 26, 2021, at 11:19 AM, Kevin Ushey  wrote:
> 
> That shouldn't be happening, at least not by default. However, RcppParallel 
> does ship with tbbmalloc_proxy, which is a library that, when loaded, will 
> overload the default allocators to use TBB's allocators instead. The 
> intention is normally that these libraries would be loaded via e.g. 
> LD_PRELOAD or something similar, since changing the allocator at runtime 
> would cause these sorts of issues.
> 
> If I test with the following:
> 
> trace(dyn.load, quote({
>   if (grepl("tbbmalloc_proxy", x))
> print(rlang::trace_back())
> }), print = FALSE)
> 
> devtools::test()
> 
> then I see:
> 
>   1. ├─base::load(test_path("data", "stanfit.RData")) at test-bcea.R:179:2
>   2. └─base::..getNamespace(``, "stanfit")
>   3.   ├─base::tryCatch(...)
>   4.   │ └─base tryCatchList(expr, classes, parentenv, handlers)
>   5.   │   └─base tryCatchOne(expr, names, parentenv, handlers[[1L]])
>   6.   │ └─base doTryCatch(return(expr), name, parentenv, handler)
>   7.   └─base::loadNamespace(name)
>   8. └─base runHook(".onLoad", env, package.lib, package)
>   9.   ├─base::tryCatch(fun(libname, pkgname), error = identity)
>  10.   │ └─base tryCatchList(expr, classes, parentenv, handlers)
>  11.   │   └─base tryCatchOne(expr, names, parentenv, handlers[[1L]])
>  12.   │ └─base doTryCatch(return(expr), name, parentenv, handler)
>  13.   └─rstan fun(libname, pkgname)
>  14. └─base::dyn.load(tbbmalloc_proxy, local = FALSE, now = TRUE)
> 
> My guess is that the 'rstan' package is trying to forcefully load 
> libtbbmalloc_proxy.dylib at runtime, and that's causing the issue. IMHO 
> 'rstan' shouldn't be doing that, at least definitely not by default.
> 
> Best,
> Kevin
> 
> On Thu, Nov 25, 2021 at 12:54 PM Simon Urbanek  
> wrote:
> Nathan,
> 
> testthat is notorious for obfuscation and unhelpful output as can be clearly 
> seen in the head of testthat.Rout.fail:
> 
> > library(testthat)
> > library(BCEA)
> 
> Attaching package: 'BCEA'
> 
> The following object is masked from 'package:graphics':
> 
> contour
> 
> > 
> > test_check("BCEA")
> 
>  *** caught segfault ***
> address 0x10d492ffc, cause 'memory not mapped'
> 
> However this appears to be hard to debug, because it is fallout from some 
> memory corruption and/or allocator mismatch: the crash happens in free() 
> while doing GC (see below). Since it happens in the GC, many bad things 
> happen afterwards.
> With some lldb magic I could trace that the crash happens during
>  ..getNamespace(c("Matrix", "1.3-3"), "stanfit")
>  load(test_path("data", "stanfit.RData"))
> but as I said that's likely too late - the memory corruption/issue likely 
> happened before. Since BCEA itself doesn't have native code, this is likely a 
> bug in some of the packages it depends on, but quite a serious one since it 
> affects subsequent code in R.
> 
> The list of packages loaded at the time of the crash - so one of them is the 
> culprit:
> 
>  [1] "rstan"  "tidyselect" "purrr"  "reshape2"  
>  [5] "lattice""V8" "colorspace" "vctrs" 
>  [9] "generics"   "testthat"   "stats4" "BCEA"  
> [13] "loo""grDevices"  "R2jags" "utf8"  
> [17] "rlang"  "pkgbuild"   "pillar" "glue"  
> [21]

Re: [R-pkg-devel] rstan issue [Was: CRAN submission error when running tests in testthat]

2021-11-26 Thread Simon Urbanek


Nathan,

no action is needed on your end since it's not your fault. It was good of you 
to have the test there because it unearthed the issue. I have re-run the test 
with hot-fixed rstan and it passes the check so you're good as far as I'm 
concerned. As you say, since it's not a strong dependency users can use your 
package even if rstan is broken. More urgently we need an update from rstan and 
stanette (.onUnload needs a corresponding fix in both cases as well).

Cheers,
Simon


> On Nov 27, 2021, at 2:10 AM, Nathan Green  wrote:
> 
> Thanks everyone for your help with this.
> TBH the technical nature is more than a little above my head.
> What are you advising as the current solution to enable me to resubmit the 
> package to CRAN?
> BCEA doesn't have a strong dependency on rstan so I could remove it as a 
> simple solution?
> Thanks again for all the help!
> 
> Nathan
> 
> 
> 
> 
> Dr Nathan Green
> 
> @: n8thangr...@yahoo.co.uk
> Tel: 07821 318353
> 
> 
> 
> On Thursday, 25 November 2021, 22:56:45 GMT, Simon Urbanek 
>  wrote:
> 
> 
> Kevin,
> 
> thanks, that's very helpful! So this is a serious bug in rstan - apparently 
> they only do that on macOS which explains why other platforms don't see it:
> 
> .onLoad <- function(libname, pkgname) {
> [...]
>   ## the tbbmalloc_proxy is not loaded by RcppParallel which is linked
>   ## in by default on macOS; unloading only works under R >= 4.0 so that
>   ## this is only done for R >= 4.0
>   if(R.version$major >= 4 && Sys.info()["sysname"] == "Darwin") {
>   tbbmalloc_proxy  <- system.file("lib/libtbbmalloc_proxy.dylib", 
> package="RcppParallel", mustWork=FALSE)
>   tbbmalloc_proxyDllInfo <<- dyn.load(tbbmalloc_proxy, local = FALSE, now 
> = TRUE)
>   }
> 
> I can confirm that commenting out that part solves the segfault and BCEA 
> passes the tests.
> 
> @Ben, please fix and submit a new version of rstan (see discussion below).
> 
> Thanks,
> Simon
> 
> 
> 
> > On Nov 26, 2021, at 11:19 AM, Kevin Ushey  wrote:
> > 
> > That shouldn't be happening, at least not by default. However, RcppParallel 
> > does ship with tbbmalloc_proxy, which is a library that, when loaded, will 
> > overload the default allocators to use TBB's allocators instead. The 
> > intention is normally that these libraries would be loaded via e.g. 
> > LD_PRELOAD or something similar, since changing the allocator at runtime 
> > would cause these sorts of issues.
> > 
> > If I test with the following:
> > 
> > trace(dyn.load, quote({
> >  if (grepl("tbbmalloc_proxy", x))
> >print(rlang::trace_back())
> > }), print = FALSE)
> > 
> > devtools::test()
> > 
> > then I see:
> > 
> >  1. ├─base::load(test_path("data", "stanfit.RData")) at test-bcea.R:179:2
> >  2. └─base::..getNamespace(``, "stanfit")
> >  3.  ├─base::tryCatch(...)
> >  4.  │ └─base tryCatchList(expr, classes, parentenv, handlers)
> >  5.  │  └─base tryCatchOne(expr, names, parentenv, handlers[[1L]])
> >  6.  │└─base doTryCatch(return(expr), name, parentenv, handler)
> >  7.  └─base::loadNamespace(name)
> >  8.└─base runHook(".onLoad", env, package.lib, package)
> >  9.  ├─base::tryCatch(fun(libname, pkgname), error = identity)
> >  10.  │ └─base tryCatchList(expr, classes, parentenv, handlers)
> >  11.  │  └─base tryCatchOne(expr, names, parentenv, handlers[[1L]])
> >  12.  │└─base doTryCatch(return(expr), name, parentenv, handler)
> >  13.  └─rstan fun(libname, pkgname)
> >  14.└─base::dyn.load(tbbmalloc_proxy, local = FALSE, now = TRUE)
> > 
> > My guess is that the 'rstan' package is trying to forcefully load 
> > libtbbmalloc_proxy.dylib at runtime, and that's causing the issue. IMHO 
> > 'rstan' shouldn't be doing that, at least definitely not by default.
> > 
> > Best,
> > Kevin
> > 
> > On Thu, Nov 25, 2021 at 12:54 PM Simon Urbanek 
> >  wrote:
> > Nathan,
> > 
> > testthat is notorious for obfuscation and unhelpful output as can be 
> > clearly seen in the head of testthat.Rout.fail:
> > 
> > > library(testthat)
> > > library(BCEA)
> > 
> > Attaching package: 'BCEA'
> > 
> > The following object is masked from 'package:graphics':
> > 
> >contour
> > 
> > > 
>

Re: [R-pkg-devel] mget with Inherits Not Finding Variable in Caller

2021-11-30 Thread Simon Urbanek


Dario,


> On Dec 1, 2021, at 12:00 PM, Dario Strbenac  
> wrote:
> 
> Good day,
> 
> What I am misunderstanding about the inherits = TRUE option of mget? I expect 
> the small example to work.
> 
> f <- function(x, .iteration = i) g() 
> g <- function() mget(".iteration", inherits = TRUE)
> f(10, 1)
> Error: value for ‘.iteration’ not found
> 

That has nothing to do with inherits and is expected - it's identical to

> f <- function(x, .iteration = i) g() 
> g <- function() .iteration
> f(10, 1)
Error in g() : object '.iteration' not found

Please note that R is lexically scoped and you defined g in the global 
environment so it has no way of seeing inside f. This would work:

> f <- function(x, .iteration = i) {
+   g <- function() .iteration
+   g() 
+ }
> f(10, 1)
[1] 1

since then the environment of f is the parent env of g.

If you want dynamic scoping (not what R uses!) you can use dynGet():

> f <- function(x, .iteration = i) g() 
> g <- function() dynGet(".iteration")
> f(10,1)
[1] 1

but since that is non-standard the docs warn:

 ‘dynGet()’ is somewhat experimental and to be used _inside_
 another function.  It looks for an object in the callers, i.e.,
 the ‘sys.frame()’s of the function.  Use with caution.

Cheers,
Simon




> --
> Dario Strbenac
> University of Sydney
> Camperdown NSW 2050
> Australia
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R Vignette Knitting Issues in CRAN Release

2021-12-09 Thread Simon Urbanek
Eric,

did you check the contents of the package file you submitted? The session info 
in the vignette is quite old, and the build has been packaged by you so I don't 
think it has anything to do with CRAN, but to make sure, check the file you 
submitted.

Cheers,
Simon


> On Dec 10, 2021, at 10:52 AM, Eric Weine  wrote:
> 
> Hello,
> 
> I'm having an issue where my vignette created using R CMD build does not
> match the vignette created using devtools::build_vignettes or when I knit
> with RStudio. I posted about this issue on stack overflow here
> .
> Originally I thought that this might be related to an omission in the
> DESCRIPTION file, but this no longer seems to be the case. Here is a repost
> of my initial stack overflow post:
> 
> I recently released a vignette with version 1.1 of my R package. The Rmd
> for the vignette can be found here.
> When I create the vignette locally, I see author information and the table
> of contents at the top of the vignette, as expected. However, when I
> submitted this package to CRAN and the vignette was created there,
> I no longer see the table of contents or author information. Does anyone
> know why this may be happening?
> 
> Thanks,
> 
> Eric.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] pandoc missing on r-{release,oldrel}-macos-x86_64

2021-12-16 Thread Simon Urbanek
Sure, installed pandoc 2.16.2.

Cheers,
Simon


> On Dec 17, 2021, at 9:21 AM, Dirk Eddelbuettel  wrote:
> 
> 
> CRAN results flag NOTEs on the two platforms
>   r-release-macos-x86_64
>   r-oldrel-macos-x86_64
> because `pandoc` is apparently missing.  These platforms being somewhat
> common, could pandoc be installed?  Or are they running such a jurassic
> version that no premade pandoc is available _anywhere_ ?
> 
> Simon, can you advise?
> 
> Thanks,  Dirk
> 
> -- 
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] UTF-8 characters inside R functions for data transformation

2021-12-21 Thread Simon Urbanek
Xavier,

short answer is no, because there is no guarantee that the user's system supports 
any encoding other than ASCII, so that code wouldn't run. Hence you can't use 
non-ASCII characters in symbols.

That said, you can use Unicode _strings_, so metadata[["\u00e1cc\u00e9nts"]] 
will work in an ASCII locale, but I would strongly caution against such objects in 
public packages.
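
To make that concrete, a small sketch (the column name is the one from the question, spelled with ASCII-only escapes):

```r
## The data may carry accented column names; package code refers to
## them through \uXXXX string escapes, keeping the source ASCII-only.
metadata <- data.frame(x = 1:3)
names(metadata) <- "\u00e1cc\u00e9nts"   # the accented name, spelled in ASCII
metadata[["\u00e1cc\u00e9nts"]]          # works even in a plain C locale
```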

Cheers,
Simon


> On Dec 22, 2021, at 9:18 AM, X FiM  wrote:
> 
> Dear all,
> 
> Somewhat related to a question that I posted a while ago (see
> https://stat.ethz.ch/pipermail/r-package-devel/2021q4/007540.html), once
> I've got a dataframe in my cache, some of the functions need to use some of
> the variables. It turns out that some of the columns contain UTF-8
> characters, and therefore I need to be able to call
> `metadata$variable.with.áccénts`.
> 
> But the package development checks warn me that no non-ASCII characters are
> allowed in the files "checking R files for non-ASCII characters ...
> WARNING".
> 
> I have specified that the package uses UTF-8 in DESCRIPTION ("Encoding:
> UTF-8"). I have also defined "options(encoding = "UTF-8")" before calling
> the checks, but nothing seems to matter. I have also tried to give the
> proper UTF-8 codes, like in `metadata$variable.with.\u00e1cc\u00e9nts`, but
> with no luck either. Also it gives an error with "\u sequences not
> supported inside backticks".
> 
> So which approach do you recommend? Is there any solution that I can use to
> call variables that use non-ASCII characters?
> 
> Thank you very much.
> 
> -- 
> Xavier
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package accepted, then an error to fix by 3/22

2022-03-08 Thread Simon Urbanek
Michael,

I can replicate the problem on macOS so this is not system-specific - did you 
actually try to re-build the vignette? As can be seen from the output the 
problem is is the following line:

one_rec %>% pull(abstract_text) %>% print

because it is an extremely long output line (4712 characters) that exceeds the 
LaTeX limit. The actual content is quite messy so I wouldn't try to output the 
whole content, but if you really want to, you may need to wrap it first since R 
output is not soft-wrapped.
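
If the full abstract really must appear, one hedged way to pre-wrap it (assuming one_rec and abstract_text as in the vignette, with dplyr loaded):

```r
## strwrap() breaks the very long abstract into short lines so that
## no single output line can overflow LaTeX's listings dimensions.
library(dplyr)
one_rec %>%
  pull(abstract_text) %>%
  strwrap(width = 70) %>%
  cat(sep = "\n")
```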

Cheers,
Simon



> On Mar 9, 2022, at 5:37 AM, Michael Barr  wrote:
> 
> Hi - had my first package accepted yesterday and published on CRAN. 
> Immediately after, I get an auto email from Prof Ripley stating there is an 
> error to be corrected or else the package  will be archived. See the relevant 
> message from check results link below.
> 
> From Google it appears to be related to a graphic in my vignette though not 
> entirely certain of that. I have precisely one image in my vignette; I've 
> passed all the checks locally and for the CRAN submission, so I am stumped 
> how to go about reproducing this error or fixing it. Looking for advice.
> 
> Thanks
> Mike
> 
> 
>   ! Dimension too large.
>\lsthk@InitVarsEOL ->\ifdim \lst@currlwidth
>> \lst@maxwidth \global \lst@maxw...
>l.857 ...nt  pharmacologic  approaches."
> 
>Error: processing vignette 'repoRter_nih.Rmd' failed with diagnostics:
>LaTeX failed to compile 
> /data/gannet/ripley/R/packages/tests-clang/repoRter.nih.Rcheck/vign_test/repoRter.nih/vignettes/repoRter_nih.tex.
>  See https://yihui.org/tinytex/r/#debugging for debugging tips. See 
> repoRter_nih.log for more info.
>--- failed re-building 'repoRter_nih.Rmd'
> 
>SUMMARY: processing the following file failed:
> 'repoRter_nih.Rmd'
> 
>Error: Vignette re-building failed.
>Execution halted
> Flavor: 
> r-devel-linux-x86_64-fedora-clang
> 
> 
> 
> 
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to fix Tcl/Tk Error on Mac OS build

2022-04-26 Thread Simon Urbanek
Thanks - indeed, as part of the full re-install for the R 4.2.0 package builds the 
Tcl/Tk libraries were missing. Now fixed, and summarytools checks with

* checking data for non-ASCII characters ... NOTE
  Note: found 78 marked UTF-8 strings
Status: 1 NOTE

I'll just repeat the important link that Tomas posted: 
https://cran.r-project.org/web/packages/external_libs.html

On a slightly orthogonal note (not related to the question other than it 
involves tcltk): I'd like to point out that Tcl/Tk is often not available on 
computing servers, so I would recommend packages that do not provide GUI tools 
to avoid dependence on tcltk if not strictly necessary as it prevents the use 
in headless HPC environments.
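One pattern for that (a sketch of my own, not code from summarytools) is to keep tcltk in Suggests and guard every GUI call, so the package still loads and works on headless servers:

```r
# Guard Tcl/Tk usage so the package degrades gracefully without it.
# capabilities("tcltk") reports build-time support; requireNamespace()
# loads the namespace only when it is actually installed and usable.
choose_input_file <- function() {
  if (capabilities("tcltk") && requireNamespace("tcltk", quietly = TRUE)) {
    tcltk::tclvalue(tcltk::tkgetOpenFile())
  } else if (interactive()) {
    file.choose()  # plain-console fallback
  } else {
    stop("no file chooser available on this system; pass a path instead")
  }
}
```

With this shape the tcltk dependency moves from Imports to Suggests, which is what makes headless HPC use possible.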

Cheers,
Simon


> On Apr 26, 2022, at 2:45 PM, Dominic Comtois  
> wrote:
> 
> I was asked by CRAN to fix some problems appearing in my summarytools
> '
> package build checks. Trivial email problems aside, on the summary page
> 
> for the Mac OS build, I see:
> 
> [...]
> checking whether package ‘summarytools’ can be installed ... [1s/1s] ERROR
> Installation failed.
> See
> ‘/Volumes/Builds/packages/big-sur-arm64/results/4.2/summarytools.Rcheck/00install.out’
> for details.
> [...]
> 
> On the details page
> ,
> I see:
> 
> [...]
> ** byte-compile and prepare package for lazy loading
> tcltk DLL is linked to '/opt/R/arm64/lib/libtk8.6.dylib'
> Error: .onLoad failed in loadNamespace() for 'tcltk', details:
>  call: fun(libname, pkgname)
>  error: Tcl/Tk libraries are missing: install the Tcl/Tk component
> from the R installer
> Execution halted
> ERROR: lazy loading failed for package ‘summarytools’
> [...]
> 
> Tcl/Tk is used by some functions to allow users to bring up an "open
> file...", & "save file..." dialogs & message boxes, as well as the
> tclvalue() function. In my NAMESPACE, I have:
> 
> importFrom(tcltk,tclvalue)
> importFrom(tcltk,tk_messageBox)
> importFrom(tcltk,tkgetOpenFile)
> importFrom(tcltk,tkgetSaveFile)
> 
> Also, a prerequisite is to have XQuartz 
> installed on Mac OS.
> 
> Summarytools has been around for a while, and I never had this problem
> before. And since I don't have a Mac machine, I'm not sure how to go about
> this. A Google search reveals that several other packages have the same
> issue. But I wasn't able to find the root cause nor a proper solution. Any
> tips / pointers would be greatly appreciated.
> 
> Thanks,
> 
> Dominic Comtois
> summarytools
>  author &
> maintainer
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] windows i386

2022-09-14 Thread Simon Urbanek
Florian,

since there was no direct response and given the earlier discussion I figured I'd 
chime in. The main problem seems to be your build system, which doesn't work. Since 
you didn't post the actual version of the package, I can only see the CRAN 
version, which still doesn't set any of the necessary flags from R, so it won't 
work (the error is that it can't even use the C compiler). Your package has to 
work both in 32-bit and 64-bit, so you need to make sure you pass the correct 
flags for cmake for each architecture - it's really the same concept as 
discussed earlier. As for multi-arch, if you have configure.win then you may 
need to use --merge-multiarch so that each architecture is built separately 
and then merged into one binary.

I think an easier approach would be to simply drop configure* and just 
use Makevars to build the dependent library as part of the build process. Since 
you are building in a sub-directory, you don't really need configure and 
could build both archs in one R CMD INSTALL run.
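A rough sketch of that Makevars approach (directory and library names are illustrative, not taken from the highs package):

```make
# src/Makevars(.win): build the vendored library before linking the
# package shared object. Making $(SHLIB) depend on the static archive
# forces the build order, and passing R's own flags (including
# $(CPICFLAGS)) means each architecture is compiled with the right
# settings in a single R CMD INSTALL run.
PKG_CPPFLAGS = -Ifoolib/include
PKG_LIBS = foolib/libfoo.a

$(SHLIB): foolib/libfoo.a

foolib/libfoo.a:
	(cd foolib && $(MAKE) CC="$(CC)" CFLAGS="$(CFLAGS) $(CPICFLAGS)" AR="$(AR)")
```

Because no configure.win exists in this layout, R builds both architectures as usual and the "building only the main architecture" warning goes away.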

Cheers,
Simon



> On 13/09/2022, at 11:14 PM, Florian Schwendinger  
> wrote:
> 
> Dear R-package-developers,
> 
> On 'r-oldrel-windows-ix86_x86_64' I get in 'check.log' the error message
> "Error: package 'highs' is not installed for 'arch = i386'"
> the statement that for arch = i386 the package is not installed is correct 
> and expected,
> since in 'install.out' I see the warning
> "Warning: this package has a non-empty 'configure.win' file, so building only 
> the main architecture"
> 
> Looking at r-package-devel archive, I found the suggestion to set Biarch TRUE 
> in the DESCRIPTION file.
> I don't want to force the build of "i386" by setting Biarch: TRUE,
> since, as far as I can see from the source I link to, this should not work, and I
> want to alter the library I link to as little as possible.
> Therefore, I want skip the step "loading checks for arch 'i386'". Is this 
> possible?
> 
> I know locally this can be resolved by adding the option '--no-multiarch',
> but can a similar effect be accomplished at the CRAN checks?
> Furthermore, I know I could avoid this issue by adding R >= 4.2 to the Depends 
> since Rtools 4.2
> does not build for i386. But this does not seem like a nice fix.
> 
> Is there a better option and how would it work?
> 
> Best regards,
> Florian
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R build fail without a message

2022-09-26 Thread Simon Urbanek
Paul,

I wouldn't worry about oldrel - that's likely an incomplete run (I don't see 
that error anymore), but I would worry about the failure on R-release:
https://www.r-project.org/nosvn/R.check/r-release-macos-arm64/EdSurvey-00check.html

You can always check with the Mac Builder before you submit it to CRAN:
https://mac.r-project.org/macbuilder/submit.html

Cheers,
Simon



> On 27/09/2022, at 10:03 AM, Bailey, Paul  wrote:
> 
> Hi,
> 
> One of my CRAN packages gets an ARM-64 build fail, visible at 
> https://www.r-project.org/nosvn/R.check/r-oldrel-macos-arm64/EdSurvey-00check.html
>  
> 
> It ends:
> 
>checking replacement functions ... OK
>checking foreign function calls ...
> 
> It looks like someone tripped over the power cord, but I have no way of 
> knowing what actually happened.
> 
> I cannot reproduce this on Rhub for R-release on ARM-64 nor on my coworker's 
> m1 on R-release. Any ideas on what I can do to diagnose this before 
> resubmitting?
> 
> Best,
> Paul Bailey, Ph.D.
> Senior Economist, AIR
> 202-403-5694
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Issue handling datetimes: possible differences between computers

2022-10-09 Thread Simon Urbanek
Alexandre,

it's better to parse the timestamp in correct timezone:

> foo = as.POSIXlt("2021-10-01", "UTC")
> as.POSIXct(as.character(foo), "Europe/Berlin")
[1] "2021-10-01 CEST"

The issue stems from the fact that you are pretending like your timestamp is 
UTC (which it is not) while you want to interpret the same values in a 
different time zone. The DST flag varies depending on the day (being 0 or 1 
depending on the date), and POSIXlt does not have that information 
since you only attached the time zone without updating it:

> str(unclass(as.POSIXlt(foo, "Europe/Berlin")))
List of 9
 $ sec  : num 0
 $ min  : int 0
 $ hour : int 0
 $ mday : int 1
 $ mon  : int 9
 $ year : int 121
 $ wday : int 5
 $ yday : int 273
 $ isdst: int 0
 - attr(*, "tzone")= chr "Europe/Berlin"

note that isdst is 0 from the UTC entry (which doesn't have DST) even though 
that date is actually DST in CEST. Compare that to the correctly parsed POSIXlt:

> str(unclass(as.POSIXlt(as.character(foo), "Europe/Berlin")))
List of 11
 $ sec   : num 0
 $ min   : int 0
 $ hour  : int 0
 $ mday  : int 1
 $ mon   : int 9
 $ year  : int 121
 $ wday  : int 5
 $ yday  : int 273
 $ isdst : int 1
 $ zone  : chr "CEST"
 $ gmtoff: int NA
 - attr(*, "tzone")= chr "Europe/Berlin"

where isdst is 1 since it is indeed DST. The OS difference seems to be that 
Linux respects the isdst information from POSIXlt while Windows and macOS 
ignore it. This behavior is documented: 

 At all other times ‘isdst’ can be deduced from the
 first six values, but the behaviour if it is set incorrectly is
 platform-dependent.

You can re-set isdst to -1 to make sure R will try to determine it:

> foo$isdst = -1L
> as.POSIXct(foo, "Europe/Berlin")
[1] "2021-10-01 CEST"

So, generally, you cannot simply change the time zone in POSIXlt - don't 
pretend the time is in UTC if it's not, you have to re-parse or re-compute the 
timestamps for it to be reliable or else the DST flag will be wrong.
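A small helper along these lines (my own sketch, not code from the thread) re-parses via a character representation so the DST flag is computed fresh for the target zone:

```r
# Convert a printable date/date-time to POSIXct in a target time zone.
# Going through format() forces R to recompute the DST flag from the
# tz database instead of carrying over a stale isdst value.
to_ct <- function(d, tz) as.POSIXct(format(d), tz = tz)

x <- to_ct(as.Date("2021-10-01"), "Europe/Berlin")
format(x, "%Y-%m-%d %Z")  # "2021-10-01 CEST" - DST correctly detected
```

This gives the same result on every platform because no isdst value is ever guessed.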

Cheers,
Simon


> On 10/10/2022, at 1:14 AM, Alexandre Courtiol  
> wrote:
> 
> Hi R pkg developers,
> 
> We are facing a datetime handling issue which manifests itself in a
> package we are working on.
> 
> In context, we noticed that reading datetime info from an excel file
> resulted in different data depending on the computer we used.
> 
> We are aware that timezone and regional settings are general sources
> of troubles, but the code we are using was trying to circumvent this.
> We went only as far as figuring out that the issue happens when
> converting a POSIXlt into a POSIXct.
> 
> Please find below, a minimal reproducible example where `foo` is
> converted to `bar` on two different computers.
> `foo` is a POSIXlt with a defined time zone and upon conversion to a
> POSIXct, despite using a set time zone, we end up with `bar` being
> different on Linux and on a Windows machine.
> 
> We noticed that the difference emerges from the system call
> `.Internal(as.POSIXct())` within `as.POSIXct.POSIXlt()`.
> We also noticed that the internal function in R actually calls
> getenv("TZ") within C, which is probably what explains where the
> difference comes from.
> 
> Such a behaviour is probably expected and not a bug, but what would be
> the strategy to convert a POSIXlt into a POSIXct that would not be
> machine dependent?
> 
> We finally noticed that depending on the datetime used as a starting
> point and on the time zone used when calling `as.POSIXct()`, we
> sometimes have a difference between computers and sometimes not...
> which adds to our puzzlement.
> 
> Many thanks.
> Alex & Liam
> 
> 
> ``` r
> ## On Linux
> foo <- structure(list(sec = 0, min = 0L, hour = 0L, mday = 1L, mon =
> 9L, year = 121L, wday = 5L, yday = 273L, isdst = 0L),
> class = c("POSIXlt", "POSIXt"), tzone = "UTC")
> 
> bar <- as.POSIXct(foo, tz = "Europe/Berlin")
> 
> bar
> #> [1] "2021-10-01 01:00:00 CEST"
> 
> dput(bar)
> #> structure(1633042800, class = c("POSIXct", "POSIXt"), tzone =
> "Europe/Berlin")
> ```
> 
> ``` r
> ## On Windows
> foo <- structure(list(sec = 0, min = 0L, hour = 0L, mday = 1L, mon =
> 9L, year = 121L, wday = 5L, yday = 273L, isdst = 0L),
> class = c("POSIXlt", "POSIXt"), tzone = "UTC")
> 
> bar <- as.POSIXct(foo, tz = "Europe/Berlin")
> 
> bar
> #> [1] "2021-10-01 CEST"
> 
> dput(bar)
> structure(1633046400, class = c("POSIXct", "POSIXt"), tzone = "Europe/Berlin")
> ```
> 
> -- 
> Alexandre Courtiol, www.datazoogang.de
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Issue handling datetimes: possible differences between computers

2022-10-10 Thread Simon Urbanek
Liam,

I think I have failed to convey my main point in the last e-mail - which was 
that you want to parse the date/time in the time zone that you care about, so in 
your example that would be

> foo <- as.Date(33874, origin = "1899-12-30")
> foo
[1] "1992-09-27"
> as.POSIXlt(as.character(foo), "Europe/Berlin")
[1] "1992-09-27 CEST"

I was explicitly saying that you do NOT want to simply change the time zone on 
POSIXlt objects as that won't work for reasons I explained - see my last e-mail.

Cheers,
Simon


> On 11/10/2022, at 6:31 AM, Liam Bailey  wrote:
> 
> Hi all,
> 
> Thanks Simon for the detailed response, that helps us understand a lot better 
> what’s going on! However, with your response in mind, we still encounter some 
> behaviour that we did not expect.
> 
> I’ve included another minimum reproducible example below to expand on the 
> situation. In this example, `foo` is a Date object that we generate from a 
> numeric input. Following your advice, `bar` is then a POSIXlt object where we 
> now explicitly define timezone using argument tz. However, even though we are 
> explicit about the timezone the POSIXlt that is generated is always in UTC. 
> This then leads to the issues outlined by Alexandre above, which we now 
> understand are caused by DST.
> 
> ``` r
> #Generate date from numeric
> #Not possible to specify tz at this point
> foo <- as.Date(33874, origin = "1899-12-30")
> dput(foo)
> #> structure(8305, class = "Date")
> 
> #Convert to POSIXlt specifying UTC timezone
> bar <- as.POSIXlt(foo, tz = "UTC")
> dput(bar)
> #> structure(list(sec = 0, min = 0L, hour = 0L, mday = 27L, mon = 8L, 
> #> year = 92L, wday = 0L, yday = 270L, isdst = 0L), class = c("POSIXlt", 
> #> "POSIXt"), tzone = "UTC")
> 
> #Convert to POSIXlt specifying Europe/Berlin.
> #Time zone is still UTC
> bar <- as.POSIXlt(foo, tz = "Europe/Berlin")
> dput(bar)
> #> structure(list(sec = 0, min = 0L, hour = 0L, mday = 27L, mon = 8L, 
> #> year = 92L, wday = 0L, yday = 270L, isdst = 0L), class = c("POSIXlt", 
> #> "POSIXt"), tzone = "UTC")
> ```
> 
> 
> We noticed that this occurs because the tz argument is not passed to 
> `.Internal(Date2POSIXlt())` inside `as.POSIXlt.Date()`.
> 
> Reading through the documentation for `as.POSIX*` we can see that this 
> behaviour is described:
> 
>   > “Dates without times are treated as being at midnight UTC.”
> 
> In this case, if we want to convert a Date object to POSIX* and specify a 
> (non-UTC) timezone would the best strategy be to first coerce our Date object 
> to character? Alternatively, `lubridate::as_datetime()` does seem to 
> recognise the tz argument and convert a Date object to POSIX* with non-UTC 
> time zone (see second example below). But it would be nice to know if there 
> are subtle differences between these two approaches that we should be aware 
> of.
> 
> ``` r
> foo <- as.Date(33874, origin = "1899-12-30")
> dput(foo)
> #> structure(8305, class = "Date")
> 
> #Convert to POSIXct specifying UTC timezone
> bar <- lubridate::as_datetime(foo, tz = "UTC")
> dput(as.POSIXlt(bar))
> #> structure(list(sec = 0, min = 0L, hour = 0L, mday = 27L, mon = 8L, 
> #> year = 92L, wday = 0L, yday = 270L, isdst = 0L), class = c("POSIXlt", 
> #> "POSIXt"), tzone = "UTC")
> 
> #Convert to POSIXct specifying Europe/Berlin
> bar <- lubridate::as_datetime(foo, tz = "Europe/Berlin")
> dput(as.POSIXlt(bar))
> #> structure(list(sec = 0, min = 0L, hour = 0L, mday = 27L, mon = 8L, 
> #> year = 92L, wday = 0L, yday = 270L, isdst = 1L, zone = "CEST", 
> #> gmtoff = 7200L), class = c("POSIXlt", "POSIXt"), tzone = 
> c("Europe/Berlin", 
> #> "CET", "CEST"))
> ```
> 
> Thanks again for all your help.
> Alex & Liam
> 
>> On 10 Oct 2022, at 6:40 pm, Hadley Wickham  wrote:
>> 
>> On Sun, Oct 9, 2022 at 9:31 PM Jeff Newmiller  
>> wrote:
>>> 
>>> ... which is why tidyverse functions and Python datetime handling irk me so 
>>> much.
>>> 
>>> Is tidyverse time handling intrinsically broken? They have a standard 
>>> practice of reading time as UTC and then using force_tz to fix the 
>>> "mistake". Same as Python.
>> 
>> Can you point to any docs that lead you to this conclusion so we can
>> get them fixed? I strongly encourage people to parse date-times in the
>> correct time zone; this is why lubridate::ymd_hms() and friends have a
>> tz argument.
>> 
>> Hadley
>> 
>> -- 
>> http://hadley.nz
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R, Rust and CRAN

2022-11-11 Thread Simon Urbanek
Florian,

this does not directly address your question, but I think it would make a lot 
of sense to standardize the process, given how many issues there were with 
packages using Rust (mainly not detecting compilers correctly, not supplying 
source code, unsolicited writing into user's directories, not checking binaries 
etc.). Doing this right is not entirely trivial which is presumably where the 
"friction" comes from. I'm not a Rust user myself, but just based on the other 
languages that have interfaces from R I would think that Rust users could 
coalesce and write a package that does the heavy lifting and can be used by 
packages that contain Rust code as a foundation - that's what most other 
language interfaces do. (It's quite possible that some of the projects you 
listed would be a good start). I would not recommend putting that burden onto 
each package author as it screams maintenance nightmare.
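For a sense of the boilerplate such a shared foundation would absorb, the build glue for a vendored crate typically looks roughly like this (crate name and paths are made up for illustration; real packages differ in detail):

```make
# src/Makevars: compile the bundled Rust crate to a static library and
# link it into the package shared object. The flags aim at CRAN policy:
# --offline forbids network access and vendored sources must be shipped.
LIBDIR = rust/target/release
STATLIB = $(LIBDIR)/libmycrate.a
PKG_LIBS = -L$(LIBDIR) -lmycrate

$(SHLIB): $(STATLIB)

$(STATLIB):
	cargo build --release --offline --manifest-path=rust/Cargo.toml

clean:
	rm -rf $(LIBDIR)
```

Getting details like compiler detection, offline builds, and per-architecture output right in every package independently is exactly the duplicated effort a common foundation would remove.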

Cheers,
Simon


> On Nov 12, 2022, at 12:31 AM, Florian Rupprecht  wrote:
> 
> Hi all,
> 
> Are there any current recommendations on integrating Rust (and Cargo, its
> official package manager) in an R package complying to CRAN's policies?
> 
> To be clear: This question is not about how to integrate Rust in the
> package, it is about how to do it without creating friction with the CRAN
> team and infrastructure. I want to write the Rust-C-R interface and build
> scripts myself.
> 
> To me, Rust seems like a very good fit for R interop as it has a native C
> FFI, and has address and UB safety guarantees that top the strictest C++
> compiler warnings. However Rust's standard library is very small by design,
> so Cargo integration would be needed.
> 
> I know there is:
> 
> - rextendr (https://cran.r-project.org/package=rextendr):
> I don't think there is a package using this on CRAN yet.
> 
> - cargo-framework (https://CRAN.R-project.org/package=cargo)
> Removed from CRAN.
> 
> - r-rust/gifski (https://cran.r-project.org/package=gifski)
> Downloads precompiled binaries on windows for CRAN (which is, as I
> understand it, strongly discouraged for a number of reasons:
> https://stat.ethz.ch/pipermail/r-devel/2022-September/082027.html).
> 
> Thank you very much for your time,
> Florian
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R, Rust and CRAN

2022-11-12 Thread Simon Urbanek
Jeroen,

sorry, I think you misunderstood: CRAN machines have the compilers, but the 
packages were not detecting it properly and/or were violating the CRAN policies 
as noted, so that's why I was saying that it would make sense to have a unified 
approach so that each package author doesn't have to make the same mistakes 
over again.

Cheers,
Simon


> On Nov 13, 2022, at 7:27 AM, Jeroen Ooms  wrote:
> 
> On Sat, Nov 12, 2022 at 12:49 AM Simon Urbanek
>  wrote:
>> 
>> this does not directly address your question, but I think it would make a 
>> lot of sense to standardize the process, given how many issues there were 
>> with packages using Rust (mainly not detecting compilers correctly, not 
>> supplying source code, unsolicited writing into user's directories, not 
>> checking binaries etc.). Doing this right is not entirely trivial which is 
>> presumably where the "friction" comes from.
> 
> All we need is the rustc/cargo toolchain to be installed on the CRAN
> win/mac builders (it already is on Fedora/Debian). As mentioned above
> (and before in e.g. https://jeroen.github.io/erum2018), rust uses the
> native C FFI, does not need any runtime library, and can be called
> directly from an R package using the C interface.  There is really no
> need for frameworks or engines (like Rcpp/rJava/V8), this is precisely
> what makes Rust so nice as an embedded language, and why it is being
> adopted in many C projects such the Linux kernel, that would never
> want to deal with Java/C++/JS.
> 
>> I'm not a Rust user myself, but just based on the other languages that have 
>> interfaces from R I would think that Rust users could coalesce and write a 
>> package that does the heavy lifting and can be used by packages that contain 
>> Rust code as a foundation - that's what most other language interfaces do. 
>> (It's quite possible that some of the projects you listed would be a good 
>> start). I would not recommend putting that burden onto each package author 
>> as it screams maintenance nightmare.
> 
> I understand the sentiment but Rust is very different from e.g. Java.
> There really isn't much "heavy lifting" to do, because there are no
> complex type conversions or runtime library involved. If the same
> structs are used in C and Rust, the C functions can directly call Rust
> functions and vice versa. Therefore it is possible for libraries to
> incrementally port pieces of C code to Rust without breaking the ABI.
> Just have a look at the hello world examples such as:
> https://github.com/r-rust/hellorust or for a real world example:
> https://cran.r-project.org/package=gifski
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R, Rust and CRAN

2022-11-12 Thread Simon Urbanek
No, because I don't use Rust. That's why I was saying that the Rust users 
should get together and create such a package. Some of the packages listed have 
experience in fixing the problems, so I would hope they can provide guidance or 
a good starting point. This is something for the interested community to do.

Cheers,
Simon


> On Nov 13, 2022, at 9:20 AM, Dirk Eddelbuettel  wrote:
> 
> 
> On 13 November 2022 at 08:15, Simon Urbanek wrote:
> | sorry, I think you misunderstood: CRAN machines have the compilers, but the 
> packages were not detecting it properly and/or were violating the CRAN 
> policies as noted, so that's why I was saying that it would make sense to 
> have a unified approach so that each package author doesn't have to make the 
> same mistakes over again.
> 
> Would you be able to provide a sample package, say, "hello.rust", that does
> that to guide us all?
> 
> Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Resubmission of archived package after email change

2022-11-24 Thread Simon Urbanek
Have you followed the instructions?


> CRAN Repository Policy
> 
> Submission
> [...]
> Explain any change in the maintainer’s email address and if possible send 
> confirmation from the previous address (by a separate email to 
> cran-submissi...@r-project.org) or explain why it is not possible.


Cheers,
Simon



> On 24/11/2022, at 10:15 PM, ROTOLO, Federico /FR  
> wrote:
> 
> Dear all,
> I am maintainer of the parfm package.
> 
> I have changed affiliation in the last months and thus I do not have any 
> longer access to the email address that was mentioned in the package 
> DESCRIPTION.
> 
> Now, the package has been archived and my resubmission with a different email 
> address has been rejected.
> Do you have any hint?
> 
> Thank you in advance for any suggestion.
> Best
> 
> Federico Rotolo
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] LAPACK errors in r-devel only

2022-11-28 Thread Simon Urbanek
Louis,

you didn't provide any details (please always post a link to the sources you're 
talking about!), but I suspect you are missing character length arguments, 
Lapack.h has:

F77_NAME(dgeevx)(const char* balanc, const char* jobvl, const char* jobvr,
 const char* sense, const int* n, double* a, const int* lda,
 double* wr, double* wi, double* vl, const int* ldvl,
 double* vr, const int* ldvr, int* ilo, int* ihi,
 double* scale, double* abnrm, double* rconde, double* rcondv,
 double* work, const int* lwork, int* iwork, int* info
 FCLEN FCLEN FCLEN FCLEN);

23 = without lengths, 27 = with lengths, so presumably the R in question is 
compiled with USE_FC_LEN_T and thus your call is missing the corresponding 
FCONE entries - see R-ext 6.6.1 Fortran character strings.

Cheers,
Simon


> On 29/11/2022, at 12:36 PM, ASLETT, LOUIS J.M.  
> wrote:
> 
> I submitted a package update to CRAN in the hopes of reinstating an archived 
> package, {PhaseType}.  The update is mostly to remove a dependency on another 
> archived package, and to add registrations of C functions.
> 
> Everything is fine on r-release, but on r-devel I have errors.  I tested this 
> with r-hub prior to submission, but wondered if there were issues there 
> because the errors didn't make any sense, so I (perhaps wrongly) submitted to 
> CRAN and the errors have been repeated but again *only* for r-devel.
> 
> The full error log (so long as it lasts) is here: 
> https://win-builder.r-project.org/incoming_pretest/PhaseType_0.2.0_20221124_233702/
> 
> Essentially, the compile errors exclusively relate to LAPACK functions which 
> should be completely stable.  As far as I can tell (welcome any correction) 
> the r-devel build is expecting a different number of arguments for these 
> LAPACK functions than r-release, which honestly baffles me.
> 
> For example, I note the error:
> 
> #> PHT_MCMC_Aslett.c:180:157: error: too few arguments to function call, 
> expected 27, have 23
> #> F77_CALL(dgeevx)(&balanc, &jobvl, &jobvr, &sense, n, NULL, n, NULL, NULL, 
> NULL, n, NULL, n, NULL, NULL, NULL, NULL, NULL, NULL, &work, &lwork, NULL, 
> &info);
> 1721
> 
> However, the standard LAPACK interface documentation for dgeevx (see eg 
> https://netlib.org/lapack/explore-html/d9/d8e/group__double_g_eeigen_ga4e35e1d4e9b63ba9eef4ba8aff3debae.html
>  ) shows that the 23 arguments I am passing is correct (and has been for 
> years), not the 27 the r-devel build for some reason expects.
> 
> Any help greatly appreciated.  I've replied along these lines to CRAN 
> rejection message, but opening this question to the list in the hope of 
> assistance in understanding what's going on with r-devel on such a 
> bog-standard LAPACK function, which must have been stable for over a decade.
> 
> Thanks in anticipation!
> 
> Louis
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Minimum macOS version ?

2022-12-08 Thread Simon Urbanek
Dirk,

the minimum required version for the high-sierra build is 10.13, and for the big-sur 
build it is 11.0 (as the names imply). Although it is not unrealistic to move 
Intel to macOS 10.14, it would be more problematic to move to 10.15, since that is 
the version that killed 32-bit support; 10.14 is therefore more prevalent than 
10.15, as many users will never upgrade due to that loss. Given that Intel 
binaries are there only to support legacy hardware, it's unlikely to move 
forward much more (i.e. I can see the immediate argument saying that those 
macOS version are unsupported, but so are the machines running them, but that 
doesn't mean R users won't use them).

What exactly do you need the higher macOS versions for? Generally, there were 
no major changes in the Mach-O format recently, so it shouldn't really matter - 
the main difference would be SDK/run-time - are there  specific features you 
require? Some of it could be addressed by providing more recent run-times 
separately - that's why I'm asking.

Cheers,
Simon


> On 9/12/2022, at 9:19 AM, Dirk Eddelbuettel  wrote:
> 
> 
> One package I stand behind as maintainer does
> 
>  ## Take care of 10.14 requirement for Intel macOS
>  if test x"${uname}" = x"Darwin" -a x"${machine}" = x"x86_64"; then
>  AC_MSG_CHECKING([for Darwin x86_64 use minimum version override])
>  CXX17_MACOS="-mmacosx-version-min=10.14"
>  AC_MSG_RESULT([${CXX17_MACOS}])
>  fi
> 
> We now contemplate going to 10.15 as minimum macOS version. Is that an issue
> for CRAN?
> 
> Apologies for the somewhat ignorant question but I was not aware where I
> could have looked this up. So thanks in advance for any pointers.
> 
> Thanks,  Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Minimum macOS version ?

2022-12-08 Thread Simon Urbanek
Dirk,

thanks, got it. If you have a work-around avoiding std::filesystem then you can use 
something like

// no filesystem support before macOS Catalina
#if defined(__MAC_OS_X_VERSION_MIN_REQUIRED) && __MAC_OS_X_VERSION_MIN_REQUIRED 
< 101500
// no std::filesystem
#endif

(derived from https://github.com/nlohmann/json/pull/3101 - some of the comments 
there may be of use - it also mentions MinGW etc.)

The "official" macros for macOS availability tests are in AvailabilityMacros.h, but 
they boil down to the above.

Cheers,
Simon


> On 9/12/2022, at 11:10 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> On 9 December 2022 at 10:00, Simon Urbanek wrote:
> | the minimum required version for the high-sierra build is 10.13 and for 
> big-sur build is 11.0 (as the names imply). Although it is not unrealistic to 
> move Intel to macOS 10.14, it would be more problematic to move to 10.15 
> since it is the version that killed 32-bit support so 10.14 is actually more 
> prevalent than 10.15 as a lot of users will never upgrade due to that loss. 
> Given that Intel binaries are there only to support legacy hardware it's 
> unlikely to move forward much more (i.e. I can see the immediate argument 
> saying that those macOS version are unsupported, but so are the machines 
> running them, but that doesn't mean R users won't use them).
> | 
> | What exactly do you need the higher macOS versions for? Generally, there 
> were no major changes in the Mach-O format recently, so it shouldn't really 
> matter - the main difference would be SDK/run-time - are there  specific 
> features you require? Some of it could be addressed by providing more recent 
> run-times separately - that's why I'm asking.
> 
> In this case: std::filesystem which is from C++20 and errors on macOS.
> 
> We will #ifdef it out of existence on that platform but use it on other
> platforms where we can.
> 
> Thaks, Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] NOTE about use of `:::`

2022-12-14 Thread Simon Urbanek
David,

why not

call[[1]] <- parse_args

The assignment is evaluated in your namespace so that makes sure the call is 
that of your function. The only downside I see is that in a stack trace you'll 
see the definition instead of the name.
Or possibly

do.call(parse_args, as.list(call[-1]))

Cheers,
Simon

PS: Note that ::: is expensive - it probably doesn't matter here, but would in 
repeatedly called functions.
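A toy version of the first suggestion, where `parse_args` stands in for the package's unexported internal function:

```r
# Internal, unexported helper that normalizes shared arguments.
parse_args <- function(x, y) list(x = x, y = y)

user_facing <- function(x, y) {
  call <- match.call(expand.dots = TRUE)
  # Replace the call head with the function *object* itself; because this
  # assignment is evaluated in our namespace, no ::: lookup is needed.
  call[[1]] <- parse_args
  # Evaluate in the caller's frame so argument expressions resolve there.
  eval.parent(call)
}

a <- 1; b <- 2
user_facing(a, b)  # list(x = 1, y = 2)
```

The trade-off Simon notes is visible in traceback(): the deparsed function body appears where the name `parse_args` would otherwise be.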


> On 15/12/2022, at 12:19 PM, David Kepplinger  
> wrote:
> 
> Dear List,
> 
> I am working on updating the pense package and refactored some of the
> methods. I have several functions which take the same arguments, hence I'm
> sending all these arguments to an internal function, called `parse_args()`.
> Since I want to evaluate the arguments in the caller's environment, I'm
> using the following code
> 
>  call <- match.call(expand.dots = TRUE)
>  call[[1]] <- quote(pense:::parse_args)
>  args <- eval.parent(call)
> 
> Of course, R CMD CHECK complains about the use of `:::`, as it's almost
> never needed. I think the above usage would fall into that area of
> "almost", but I'm not sure if (a) there's a better approach and (b) the
> CRAN team would agree with me. I would have to test (b) by submitting and
> working with the CRAN team, but I wanted to ask the list first to see if
> I'm missing something obvious. I don't want to export the function
> parse_args() as it's not useful for a user, and the use is truly internal.
> 
> Thanks and all the best,
> David
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] If you had to choose one binary to preserve for a pkg, which would it be?

2023-01-24 Thread Simon Urbanek
Uri,

I can speak only for macOS package binaries and they have been rarely re-built. 
The only time when a re-build is necessary is when a dependency is updated and 
breaks its backward-compatibility (sadly, yes, that happens). It is relatively 
rare, but recently Matrix was one example with reasonably big fall-out. Those 
things are likely to happen more often in the future, but if you are mainly 
interested in an archive then you should be able to simply go by modification 
dates for the macOS binaries. However, I would add a black-out period after a 
major R release, because what happens is that I do a full re-build of all 
packages after a major R release (up til then the packages are build against 
the beta/RC) and that can take up to a few days, so I wouldn't keep packages 
built before that first set is done which can be few days after the release.

Cheers,
Simon


> On Jan 23, 2023, at 10:36 AM, Uri Simonsohn  wrote:
> 
> This is not a perfect list for this question, but possibly a good list.
> 
> I maintain 'groundhog', a package that seeks to simplify reproducibility 
> of R code based on R packages.
> It has so far relied on MRAN  for binaries of older/archived versions of 
> packages, but MRAN is shutting down.
> Posit (RStudio) also has archived binaries, but they are less 
> transparent about it, they do not have Mac binaries, and I am a little 
> uncomfortable relying on a third party again, especially because their 
> archive is more difficult to navigate and it is part of a for-profit 
> venture, so access is far from guaranteed. So...
> 
> I will create an independent archive of all binaries for packages for 
> Windows and Mac machines.
> 
> Instead of having daily backups like MRAN does/did, I will keep just one 
> binary per combination of package, version, R version, and operating system.
> So a single 'rio' 0.5.0 binary for Windows for R-4.2.x, for example 
> (MRAN keeps a daily copy of such a file instead, possibly with 100+ 
> identical or nearly identical copies).
> 
> I need to decide whether to keep the first binary that was uploaded to 
> CRAN, the last one, or one in the middle, etc.
> In concept, binaries should work regardless of which file is chosen, but 
> there is a reason, I guess, that they are rebuilt so often, so it may make a 
> difference at the margin which of the many builds available in MRAN is 
> chosen to be preserved. I think it has to do with changes in underlying 
> packages used to build them, but I am not sure.
> This decision will also guide future archiving, i.e. which of the many 
> versions of binaries uploaded to CRAN are preserved.
> 
> So, if you have experience or knowledge on this, which of the many 
> previously created binaries for a given package version would you choose 
> to archive long-term?
> Groundhog will always attempt to install from source if a binary fails, 
> so a certain error rate is tolerable.
> 
> Uri
> 
> --
> 
> Uri Simonsohn (urisohn.com)
> 
> Professor of Behavioral Science, ESADE, Barcelona
> 
> Senior Fellow, Wharton School, University of Pennsylvania
> 
> Blog at:  DataColada.org 
> 
> Easy data sharing: ResearchBox.org
> 
> Twitter: @uri_sohn
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Best current way to hook into the event loop?

2023-02-02 Thread Simon Urbanek
Duncan,

I don't know if it is best, but you can have a look at "background"[1] which is 
I believe what "later" was inspired by. It is a very minimal example so should 
give you ideas on how to do that in your package - it runs the R code on the 
main thread so it should be as close to safe as one can get with asynchronous 
calls, just beware of reentrance. In your case I suspect that you may already 
have an X11 fd that you can use in the unix handler - not sure what signaling 
you need on Windows, though.

Cheers,
Simon

[1] - https://github.com/s-u/background
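
For comparison with the "later" approach Duncan mentions, the basic self-rescheduling polling pattern looks like this (a sketch only; the event check is a placeholder, and Duncan's all-or-nothing concern about running R code still applies):

```r
# Reschedule-yourself polling: run a low-level event check roughly every
# 0.03 s from R's event loop without blocking the console.
poll_events <- function() {
  # ... process pending GLFW/X11 events here (placeholder) ...
  later::later(poll_events, delay = 0.03)   # put ourselves back in the queue
}
# later::later(poll_events, delay = 0.03)   # start the polling chain
```

The chain stops as soon as `poll_events` returns without rescheduling itself, which gives a clean shutdown path.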


> On Feb 3, 2023, at 10:53 AM, Duncan Murdoch  wrote:
> 
> I'm updating low level stuff in the rgl package.  I'm exploring using the 
> GLFW library to handle low level stuff instead of trying to do that myself.
> 
> Currently rgl has fairly ugly code to link into the R event loop.  (It needs 
> to do this so that it hears about mouse movement, etc.)  I'm hoping that 
> someone else has written better code than I could to do this.  Is there a 
> currently recommended way to hook into the loop?
> 
> The kinds of things I need are getting events fairly frequently (e.g. 0.03 
> second delay if things aren't too busy) for code that doesn't involve R at 
> all.
> 
> I will also occasionally want to call back into R; I don't really mind if 
> there's a bigger delay to wait for that to be safe.
> 
> I've seen the "later" package:  it looks as though it might do what I need, 
> but the protection for R code seems to be all or nothing, i.e. if I want to 
> evaluate an R expression nothing else can be running, or if I want action 
> while R code is running, I can't involve R at all.
> 
> Duncan Murdoch
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R OpenCL on an AMD GPU

2023-02-07 Thread Simon Urbanek
Quirin,

this is a contributed package question, so you should either use the GitHub 
issues (https://github.com/s-u/OpenCL/issues) for the package or contact the 
maintainer (me). But before you do so, you have to provide a lot more details 
including exact code you used and the full output of the errors you saw. 
Finally, on Windows, make sure you refer to the instructions posted here to 
pre-load your run-time DLL: 
https://github.com/s-u/OpenCL/issues/6#issuecomment-899114747

Cheers,
Simon


> On Feb 8, 2023, at 5:08 AM, Quirin Stier  wrote:
> 
> Hi,
> 
> I am trying to set up the CRAN package "OpenCL" to run on a Windows 10
> machine with an AMD GPU (Radeon 7900 XTX).
> 
> I installed the AMD drivers (which carry the OpenCL.dll) and the
> installation of OpenCL in R didnt work. So I also downloaded the AMD SDK
> to set up the environment variables OCL64LIB, OCL32LIB and OCLINC. The
> installation worked, but the built package was broke (OpenCL kernels
> didnt execute due to a multiplicity of errors - the OpenCL set up must
> have been wrong).
> 
> The installation of OpenCL was done with install.packages("OpenCL",
> INSTALL_opts = "--no-test-load") trying it with and without the
> install_opts option.
> 
> Does anyone have a clue?
> 
> Best regards,
> 
> Quirin
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Sanitize Input Code for a Shiny App

2023-02-26 Thread Simon Urbanek
Bill,

the short answer is you can't limit anything at R level. Any attempts to create 
a list of "bad" commands are trivial to circumvent since you can compute on the 
language in R, so you can construct and call functions with trivial operations. 
Similarly, since R allows the loading of binary code the same applies for 
arbitrary code. So if you give users access to R, you should assume that is 
equivalent to allowing arbitrary code execution. Therefore all you can do is 
limit the resources and reach - as you pointed out using a container is a good 
idea so each session is only limited to a single container that goes away when 
the session ends. Similarly you can restrict main parts of R and the system to 
be read-only in the container.

In practice, that's why real analytic systems are about provenance rather than 
prevention. For example, in RCloud all code is first committed to a git 
repository outside of the container before it can be executed, so malicious 
users can do whatever they want, but they cannot hide the malicious code they 
used as the container cannot manipulate the history.

As for package installation - again, it's impossible to prevent it in general 
unless you make everything read-only which also prevents the users from doing 
meaningful work. So the real question is what you want to allow the user to do 
- why would you need to allow literal R code evaluation? The other alternative 
is to simply limit the interaction not allowing the user to submit arbitrary 
code, only tweak parameters or use GUI to select particular choices. Obviously, 
that is a lot easier to secure.
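
As a concrete illustration of why deny lists fail in R (a sketch; the dangerous call is deliberately left unevaluated):

```r
# The string "system" never appears literally in the submitted code,
# yet the function is trivially recovered at run time:
f_name <- paste0("sys", "tem")
f <- get(f_name, envir = baseenv())
identical(f, base::system)            # TRUE

# Or construct the call as a language object for later evaluation:
cl <- as.call(list(as.name(f_name), "echo hi"))
# eval(cl)  # would run system("echo hi")
```

Any scanner matching literal names is defeated by one level of string or language manipulation, which is why containment rather than filtering is the workable strategy.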

Cheers,
Simon


> On 27/02/2023, at 8:36 AM, b...@denney.ws wrote:
> 
> Hello,
> 
> 
> 
> I'm working to develop a Shiny app where I'd like to have an advanced
> capability to accept user input and run the code.  For the code received,
> I'd like to be able to prevent R from doing things other than working within
> the R session.  For example, I want to prevent `system("rm -rf /*")`.
> 
> 
> 
> One method to achieve this is to run the R session within a Docker container
> and perform the security around the container.  The user could do some
> things within the container, but they would be limited.
> 
> 
> 
> What I'd like to be able to do is to sanitize the inputs to ensure that it
> won't do things including installing packages, running system commands,
> reading and writing to the filesystem, and accessing the network.  I'd like
> to allow the user to do almost anything they want within R, so making a list
> of acceptable commands is not accomplishing the goal.  I could try to do
> something like:
> 
> 
> 
> * have acceptable packages loaded, only,
> * don't allow loading additional packages,
> * deny a set of known-bad commands (e.g. system, system2, etc.)
> * deny any attempt to run from additional packages (exclude calls with
> a double-colon or triple-colon)
> 
> 
> 
> The method I just described seems like it would not work well because it
> assumes that the known-bad commands is comprehensive and that I'm being
> creative enough in ways that users could try to break things.
> 
> 
> 
> Is there a good way to sanitize arbitrary code from users to prevent
> malicious behavior?
> 
> 
> Thanks,
> 
> 
> 
> Bill
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to declare Bioconductor Dependencies in the Description File of my R Package

2023-03-17 Thread Simon Urbanek
Packages can only be installed from the repositories listed, and only CRAN is 
the default, so only CRAN packages are guaranteed to work. I'd like to add that 
the issue below is exactly why, personally, I would not recommend using a 
Bioconductor package as a strong dependency (Imports/Depends), because that makes 
the package unusable for most users, as it cannot be installed (without extra 
steps they don't know about) since the dependency doesn't exist on CRAN.

If your users are already Bioconductor users by virtue of the package's 
application, then they already know, so it's fine; but then you are probably 
better off having your package on Bioconductor as part of the ecosystem, which 
is much more streamlined and coordinated.

If it is only suggested (weak dependency) for some optional functionality, then 
your package will work even if the dependency is not installed so all is well. 
And if the optional Bioconductor functionality is used you can direct the user 
to instructions explaining that Bioconductor is required for that - but the 
package has to do that, it is not anything automatic in R.
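
A hedged sketch of that weak-dependency pattern ('limma' here is only a placeholder for whatever Bioconductor package is listed in Suggests):

```r
# Guard the optional functionality behind a run-time check and point the
# user at Bioconductor's installer if the package is missing.
with_bioc_feature <- function(...) {
  if (!requireNamespace("limma", quietly = TRUE))
    stop("This feature requires the Bioconductor package 'limma'.\n",
         "Install it with:\n",
         "  install.packages(\"BiocManager\")\n",
         "  BiocManager::install(\"limma\")",
         call. = FALSE)
  limma::lmFit(...)
}
```

With this guard, the package installs and loads from CRAN alone, and the Bioconductor requirement only surfaces when the optional feature is actually used.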

Cheers,
Simon


> On Mar 18, 2023, at 1:29 AM, Ruff, Sergej  
> wrote:
> 
> Really? What's the problem I have when all dependencies aren't pre-installed? I 
> thought the problem would be solved once my package is available on CRAN.
> 
> 
> Here is a recent question I had regarding the same issue:
> 
> 
> I am currently working on an R package. I would like to submit my R package to 
> CRAN, but I have a question regarding dependency installations on CRAN.
> 
> I have almost finished my package, added the required dependencies to the 
> NAMESPACE and DESCRIPTION files as Imports, and get no errors or warnings
> 
> when running the check in RStudio. The package runs on the PC where I've 
> built the package, but when I install the package on a PC where the 
> dependencies
> 
> are not preinstalled, I get the following error:
> 
> ERROR:
> 
> dependencies 'depth', 'geometry' are not available for package 'packagename'
> * removing 
> 'C:/Users/156932/AppData/Local/Programs/R/R-4.2.1/library/packagename'
> Warning in install.packages : installation of package ‘packagename’ had 
> non-zero exit status
> 
> 
> The problem is that a local installation of my package (via USB-stick for 
> example) can't install the dependencies from CRAN.
> 
> The package works perfectly fine, if the dependencies are preinstalled.
> 
> Now I don't want to submit my package to CRAN if the end user gets the same 
> error message when installing my package.
> 
> Question: After I submit my package to CRAN, will CRAN install dependencies 
> automatically (via "install.packages()"), resolving the issue I have right 
> now?
> 
> Or do I have to modify the R-package or the Description-file to make sure my 
> Package can install dependencies?
> 
> I provided the dependencies to the NAMESPACE-file as @ImportFrom via the 
> devtools::document()-function. I added the dependencies to the 
> DESCRIPTION-file via usethis::use_package("x",type="Imports"). The 
> Description looks like this:
> 
> License: GPL (>= 3)
> Encoding: UTF-8
> LazyData: true
> RoxygenNote: 7.2.3
> Imports:
>depth,
>geometry,
>graphics,
>grDevices,
>MASS,
>mvtnorm,
>nlme,
>rgl,
>stats
> 
> 
> 
> So I thought all dependencies would install automatically from CRAN? Is that 
> not the case?
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Discovering M1mac cowpads

2023-03-24 Thread Simon Urbanek
John,

you provide no details to go on, but generally the main difference is that 
arm64 uses 64-bit precision for long double (which is permitted by the C 
standard), while Intel uses 80-bits of precision (on systems that enable it). 
That leads to differences in results, e.g. when computing long sums:

set.seed(1); x=rnorm(1e6)

## Intel with extended precision
> sprintf("%a", sum(x))
[1] "0x1.7743176e2372bp+5"

## arm64
> sprintf("%a", sum(x))
[1] "0x1.7743176e23a33p+5"


For R you can get the same results on all platforms by using 
--disable-long-double which prevents the use of extended precision doubles in R 
- this is Intel with --disable-long-double:

> sprintf("%a", sum(x))
[1] "0x1.7743176e23a33p+5"

Cheers,
Simon



> On Mar 25, 2023, at 8:03 AM, J C Nash  wrote:
> 
> Recently I updated my package nlsr and it passed all the usual checks and was
> uploaded to CRAN. A few days later I got a message that I should "fix" my
> package as it had failed in "M1max" tests.
> 
> The "error" was actually a failure in a DIFFERENT package that was used as
> an example in a vignette. I fixed it in my vignette with try(). However, I
> am interested in just where the M1 causes trouble.
> 
> As far as I can determine so far, for numerical computations, differences will
> show up only when a package is able to take advantage of extended precision
> registers in the IEEE arithmetic. I think this means that in pure R, it won't
> be seen. Packages that call C or Fortran could do so. However, I've not yet
> got a good handle on this.
> 
> Does anyone have some small, reproducible examples? (For me, reproducing so
> far means making a small package and submitting to macbuilder, as I don't
> have an M1 Mac.)
> 
> Cheers,
> 
> John Nash
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] 'Default' macos (x86) download URL now gone?

2023-04-23 Thread Simon Urbanek
Dirk,

thanks - the problem is that there is not a single installer package (for 
several years now), so that URL is ambiguous. Whether the missing link is a 
good or a bad thing depends on how it is used. I would argue that any link to that URL 
is inherently bad, because there is no way of knowing that the link works for a 
particular system - that's why I have originally removed it with the R 4.3.0 
release. I have restored it now, making it point to the R 4.3.0 arm64 release 
since that is arguably the closest to a single "latest R". R releases have not 
been stored in /bin/macosx since 2015, so anyone using a link there is asking 
for trouble.

For any CI I would strongly recommend using the "last-success" links: 
https://mac.r-project.org/big-sur/last-success/ in particular the .xz versions 
as they are specifically designed to be used by CI (small download, fast and 
localized install).

Cheers,
Simon


> On 24/04/2023, at 3:25 AM, Dirk Eddelbuettel  wrote:
> 
> 
> The URL ${CRAN}/bin/macosx/R-latest.pkg is in fairly widespread use. A quick
> Google query [1] reveals about 1.1k hits. And it happens to be used too in a
> CI job a colleague noticed failing yesterday.
> 
> The bin/macosx/ page now prominently displays both leading flavours
>  R-4.3.0-arm64.pkg
>  R-4.3.0-x86_64.pkg
> which makes sense given the architecture choices. We can of course update the
> CI script, and likely will.
> 
> But given that this was apparently a somewhat widely-used URL to fetch R on
> macOS, may I suggest that the convenience link be reestablished as a courtesy?
> 
> Best,  Dirk
> 
> https://github.com/search?q=macosx%2FR-latest.pkg&type=code
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to declare Bioconductor Dependencies in the Description File of my R Package

2023-05-03 Thread Simon Urbanek



> On May 4, 2023, at 3:36 AM, Martin Morgan  wrote:
> 
> CRAN is fine with Bioconductor Depends: and Imports: dependencies, as 
> previously mentioned. This is because the CRAN maintainers explicitly 
> configure their system to know about Bioconductor package repositories.
> 

That is not exactly true (at least not for all maintainers ;)). Bioconductor 
packages are installed on as-needed (best-effort) basis and it is a manual 
process. Ideally, Bioconductor packages would be in Suggests, because if they 
are not, the package binary will be effectively broken for most users as they 
cannot install it without additional steps (and no stable state can be 
guaranteed, either). That's why I believe someone was suggesting a pre-flight 
check that alerts the user to such a situation and prints instructions to remedy 
it (e.g., to use setRepositories()) as the majority of users will have no idea 
what's going on.
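
A sketch of that pre-flight check ('SomeBiocPkg' is a placeholder name):

```r
# In the CRAN package's R code: alert the user at attach time if the
# Bioconductor dependency cannot be found, with concrete instructions.
.onAttach <- function(libname, pkgname) {
  if (!requireNamespace("SomeBiocPkg", quietly = TRUE))
    packageStartupMessage(
      "The Bioconductor package 'SomeBiocPkg' is not installed.\n",
      "Enable Bioconductor repositories with setRepositories(), or run:\n",
      "  BiocManager::install(\"SomeBiocPkg\")")
}
```

Using `packageStartupMessage()` rather than `message()` lets users silence the notice with `suppressPackageStartupMessages()`.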

Cheers,
Simon



> Users face a different challenge -- many users will not have identified 
> (e.g., via `setRepositories()` a Bioconductor repository, so when they try to 
> install your package it will fail in a way that you have no control over -- a 
> generic message saying that the Bioconductor dependencies was not found.
> 
> You could mitigate this by advertising that your CRAN package should be 
> installed via `BiocManager::install("")`, which defines 
> appropriate repositories for both CRAN and Bioconductor, but there is no way 
> to unambiguously communicate this to users.
> 
> Martin
> 
> From: R-package-devel  on behalf of 
> Ruff, Sergej 
> Date: Wednesday, May 3, 2023 at 11:13 AM
> To: Dirk Eddelbuettel 
> Cc: r-package-devel@r-project.org 
> Subject: Re: [R-pkg-devel] How to declare Bioconductor Dependencies in the 
> Description File of my R Package
> Thank you, Dirk.
> 
> 
> I see your dependencies are Suggested. I know that Suggested dependencies 
> should be conditional.
> 
> 
> Do you know if non-CRAN (Bioconductor) packages need to be conditional? Do 
> you have any experience regarding non-CRAN dependencies
> 
> and how to handle them?
> 
> 
> I believe Duncan Murdoch's experience and opinion regarding that topic, but I 
> will take any second or third opinion to be sure.
> 
> 
> Thank you for your help.
> 
> 
> Sergej
> 
> 
> Von: Dirk Eddelbuettel 
> Gesendet: Mittwoch, 3. Mai 2023 16:22:09
> An: Ruff, Sergej
> Cc: Duncan Murdoch; Ivan Krylov; r-package-devel@r-project.org
> Betreff: Re: [R-pkg-devel] How to declare Bioconductor Dependencies in the 
> Description File of my R Package
> 
> 
> Sergej,
> 
> Please consider:
> 
>  - there are nearly 20k CRAN packages
> 
>  - all of them are mirrored at https://github.com/cran so you can browse
> 
>  - pick any one 'heavy' package you like, Seurat is a good example; there
>are other examples in geospatial or bioinformatics etc
> 
>  - you can browse _and search_ these to your hearts content
> 
> Here is an example of mine. In RcppArmadillo, years ago we (thanks to fine
> Google Summer of Code work by Binxiang Ni) added extended support for sparse
> matrices pass-through / conversion from R to C++ / Armadillo and back. That
> is clearly an optional feature as most uses of (Rcpp)Armadillo use dense
> matrices. So all code and test code is _conditional_.  File DESCRIPTION has
> 
>   Suggests: [...], Matrix (>= 1.3.0), [...], reticulate, slam
> 
> mostly for tests. I.e. We have very little R code: in one single file
> R/SciPy2R.R we switched to doing this via reticulate and open the function
> with
> 
>if (!requireNamespace("reticulate", quietly=TRUE)) {
>stop("You must install the 'reticulate' package (and have SciPy).", 
> call.=FALSE)
>}
> 
> after an actual deprecation warning (as there was a SciPy converter once).
> 
> Similarly, the testsuites in inst/tinytests/* have several
> 
>if (!requireNamespace("Matrix", quietly=TRUE)) exit_file("No Matrix 
> package")
> 
> as well as
> 
>if (!requireNamespace("reticulate", quietly=TRUE)) exit_file("Package 
> reticulate missing")
> 
>if (!packageVersion("reticulate") >= package_version("1.14"))
>exit_file("SciPy not needed on newer reticulate")
> 
> and tests for slam (another sparse matrix package besides the functionality
> in Matrix).
> 
> Hopefully this brief snapshot gives you an idea.  There are (likely!!)
> thousands of examples you can browse, and I am sure you will find something.
> If you have further (concrete) questions please do not hesitate to use the
> resource of this list.
> 
> Cheers (or I should say "mit Braunschweiger Gruessen nach Hannover" - with 
> Braunschweig greetings to Hannover),
> 
> Dirk
> 
> --
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
>[[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
>   [[alternative HTML version deleted]]
> 
> __

Re: [R-pkg-devel] Unfortunate function name generic.something

2023-05-08 Thread Simon Urbanek



> On 8/05/2023, at 11:58 PM, Duncan Murdoch  wrote:
> 
> There really isn't such a thing as "a function that looks like an S3 method, 
> but isn't".  If it looks like an S3 method, then in the proper circumstances, 
> it will be called as one.
> 


I disagree - that was the case in old versions, but not anymore. The whole 
point of introducing namespaces and method registration was to make it clear 
when a function is a method and when it is a function. If you export a function 
it won't be treated as a method:

In a package NAMESPACE:
export(foo.cls)
package R code: foo.cls <- function(x) "foo.cls"

in R:
> cls=structure(1,class="cls")
> foo=function(x) UseMethod("foo")
> foo(cls)
Error in UseMethod("foo") : 
  no applicable method for 'foo' applied to an object of class "cls"
> foo.cls(cls)
[1] "foo.cls"

So R knows very well what is a method and what is a function. If you wanted it 
to be a method, you have to use S3method(foo, cls) and that **is** different 
from export(foo.cls) - quite deliberately so.

Cheers,
Simon


> In your case the function name is levels.no, and it isn't exported.  So if 
> you happen to have an object with a class inheriting from "no", and you call 
> levels() on it, levels.no might be called.
> 
> This will only affect users of your package indirectly.  If they have objects 
> inheriting from "no" and call levels() on them, levels.no will not be called. 
>  But if they pass such an object to one of your package functions, and that 
> function calls levels() on it, they could end up calling levels.no().  It all 
> depends on what other classes that object inherits from.
> 
> You can test this yourself.  Set debugging on any one of your functions, then 
> call it in the normal way.  Then while still in the debugger set debugging on 
> levels.no, and create an object using
> 
>  x <- structure(1, class = "no")
> 
> and call levels(x).  You should break to the code of levels.no.
> 
> That is why the WRE manual says "First, a caveat: a function named gen.cl 
> will be invoked by the generic gen for class cl, so do not name functions in 
> this style unless they are intended to be methods."
> 
> So probably the best solution (even if inconvenient) is to rename levels.no 
> to something that doesn't look like an S3 method.
> 
> Duncan Murdoch
> 
> On 08/05/2023 5:50 a.m., Ulrike Groemping wrote:
>> Thank you for the solution attempt. However, using the keyword internal
>> does not solve the problem, the note is still there. Any other proposals
>> for properly documenting a function that looks like an S3 method, but isn't?
>> Best, Ulrike
>> Am 05.05.2023 um 12:56 schrieb Iris Simmons:
>>> You can add
>>> 
>>> \keyword{internal}
>>> 
>>> to the Rd file. Your documentation won't show up the in the pdf
>>> manual, it won't show up in the package index, but you'll still be
>>> able to access the doc page with ?levels.no  or
>>> help("levels.no ").
>>> 
>>> This is usually used in a package's deprecated and defunct doc pages,
>>> but you can use it anywhere.
>>> 
>>> On Fri, May 5, 2023, 06:49 Ulrike Groemping
>>>  wrote:
>>> 
>>> Dear package developeRs,
>>> 
>>> I am working on fixing some notes regarding package DoE.base.
>>> One note refers to the function levels.no  and
>>> complains that the
>>> function is not documented as a method for the generic function
>>> levels.
>>> Actually, it is not a method for the generic levels, but a standalone
>>> internal function that I like to have documented.
>>> 
>>> Is there a way to document the function without renaming it and
>>> without
>>> triggering a note about method documentation?
>>> 
>>> Best, Ulrike
>>> 
>>> --
>>> ##
>>> ## Prof. Ulrike Groemping
>>> ## FB II
>>> ## Berliner Hochschule für Technik (BHT)
>>> ##
>>> ## prof.bht-berlin.de/groemping 
>>> ## Phone: +49(0)30 4504 5127
>>> ## Fax:   +49(0)30 4504 66 5127
>>> ## Home office: +49(0)30 394 04 863
>>> ##
>>> 
>>> __
>>> R-package-devel@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>> 
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unfortunate function name generic.something

2023-05-09 Thread Simon Urbanek
Duncan,

you're right that any functions in the call environment are always treated as 
methods (even before consulting method registrations). That is a special case - 
I presume for compatibility with the world before namespaces so that, e.g., you 
don't have to register methods in the global environment when working 
interactively. I wonder if that is something that packages could choose to opt 
out of for safety since they are already relying on method registration (and 
that would also in theory improve performance).

One interesting related issue is that in the current implementation of the 
method registration there is no concept of "private" methods (which is what the 
above rule effectively provides) since methods get registered with the generic, 
so they are either visible to everyone or not at all. If one would really want 
to support this, it would require a kind of "local" registration and then 
replacing the name-based search up the call chain with local registration 
search - but probably again at the cost of performance.
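
A small self-contained demonstration of the caller-environment lookup discussed above (so the "private method" effect the special case provides):

```r
gen <- function(x) UseMethod("gen")

wrapper <- function(x) {
  # Defined locally, never exported or registered via S3method():
  gen.cls <- function(x) "found the local method"
  gen(x)   # dispatch searches the frame the generic is called from
}

obj <- structure(1, class = "cls")
wrapper(obj)   # "found the local method"
# gen(obj)     # at top level: error, no applicable method for 'gen'
```

Inside `wrapper()` the local `gen.cls` behaves as a method visible only there, which is exactly the "private method" semantics that registration-based dispatch alone cannot express.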

Cheers,
Simon


> On May 9, 2023, at 11:23 AM, Duncan Murdoch  wrote:
> 
> On 08/05/2023 6:58 p.m., Simon Urbanek wrote:
>>> On 8/05/2023, at 11:58 PM, Duncan Murdoch  wrote:
>>> 
>>> There really isn't such a thing as "a function that looks like an S3 
>>> method, but isn't".  If it looks like an S3 method, then in the proper 
>>> circumstances, it will be called as one.
>>> 
>> I disagree - that was the case in old versions, but not anymore. The whole 
>> point of introducing namespaces and method registration was to make it clear 
>> when a function is a method and when it is a function. If you export a 
>> function it won't be treated as a method:
>> In a package NAMESPACE:
>> export(foo.cls)
>> package R code: foo.cls <- function(x) "foo.cls"
>> in R:
>>> cls=structure(1,class="cls")
>>> foo=function(x) UseMethod("foo")
>>> foo(cls)
>> Error in UseMethod("foo") :
>>   no applicable method for 'foo' applied to an object of class "cls"
>>> foo.cls(cls)
>> [1] "foo.cls"
>> So R knows very well what is a method and what is a function. If you wanted 
>> it to be a method, you have to use S3method(foo, cls) and that **is** 
>> different from export(foo.cls) - quite deliberately so.
> 
> That is true for package users, but it's not true within the package.  I just 
> tested this code in a package:
> 
>  levels.no <- function(xx, ...) {
>stop("not a method")
>  }
> 
>  f <- function() {
>x <- structure(1, class = "no")
>levels(x)
>  }
> 
> Both levels.no and f were exported.  If I attach the package and call f(), I 
> get the error
> 
>  > library(testpkg)
>  > f()
>  Error in levels.no(x) : not a method
> 
> because levels.no is being treated as a method when levels() is called in the 
> package.
> 
> If I create an x like that outside of the package and call levels(x) there, I 
> get NULL, because levels.no is not being treated as a method in that context.
> 
> As far as I know, there is no possible way to have a function in a package 
> that is called "levels.no" and not being treated as a method within the 
> package.  I don't think there's any way to declare "this is not a method", 
> other than naming it differently.
> 
> Duncan
> 
>> Cheers,
>> Simon
>>> In your case the function name is levels.no, and it isn't exported.  So if 
>>> you happen to have an object with a class inheriting from "no", and you 
>>> call levels() on it, levels.no might be called.
>>> 
>>> This will only affect users of your package indirectly.  If they have 
>>> objects inheriting from "no" and call levels() on them, levels.no will not 
>>> be called.  But if they pass such an object to one of your package 
>>> functions, and that function calls levels() on it, they could end up 
>>> calling levels.no().  It all depends on what other classes that object 
>>> inherits from.
>>> 
>>> You can test this yourself.  Set debugging on any one of your functions, 
>>> then call it in the normal way.  Then while still in the debugger set 
>>> debugging on levels.no, and create an object using
>>> 
>>>  x <- structure(1, class = "no")
>>> 
>>> and call levels(x).  You should break to the code of levels.no.
>>> 
>>> That is why the WRE manual says "First, a caveat: a function named gen.
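The dispatch behaviour Duncan describes can be reproduced at the top level, where
(unlike across a package namespace boundary) any visible function named
generic.class is picked up by dispatch. A minimal sketch, not code from either
package:

```r
# A function that merely *looks* like an S3 method is still found by
# UseMethod() dispatch wherever it is visible -- here, the global
# environment plays the role of "inside the package".
levels.no <- function(xx, ...) stop("not a method")
x <- structure(1, class = "no")
try(levels(x))   # dispatch finds levels.no and errors with "not a method"
```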

Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-10 Thread Simon Urbanek
Dirk,

can you be more specific, please? I suspect that it may be rather an issue in 
your package. All build machines have the official cmake releases installed and 
there are many packages that use it successfully. Here is the report on the 
currently installed versions. If you require more recent version, let me know.

high-sierra-x86_64$ /Applications/CMake.app/Contents/bin/cmake --version | head 
-n1
cmake version 3.17.3

big-sur-arm64$ /Applications/CMake.app/Contents/bin/cmake --version | head -n1
cmake version 3.19.4

mac-builder-arm64$ /Applications/CMake.app/Contents/bin/cmake --version | head 
-n1
cmake version 3.21.2

big-sur-x86_64$ /Applications/CMake.app/Contents/bin/cmake --version | head -n1
cmake version 3.26.0

Cheers,
Simon


> On May 11, 2023, at 12:01 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> Explicitly declaring
> 
>SystemRequirements: cmake
> 
> appears to be insufficient to get a build on the (otherwise lovely to have)
> 'macOS builder', and leads to failure on (at least) 'r-oldrel-macos-x86_64'.
> 
> Would it be possible to actually have cmake installed?
> 
> These days cmake is for better or worse becoming a standard, and I rely on it
> for one (new) package to correctly configure a library. It would be nice to
> be able to rely on it on macOS too.
> 
> Thanks,  Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Simon Urbanek
I think it would be quite useful to have some community repository of code 
snippets dealing with such situations. R-exts gives advice and pieces of code 
which are useful, but they are not complete solutions and situations like 
Dirk's example are not that uncommon. (E.g., I recall some of the spatial 
packages copy/pasting code from each other for quite some time - which works, 
but is error prone if changes need to be made).

If one has to rely on a 3rd party library and one wants to fall back to source 
compilation when it is not available, it is a quite complex task, because one 
has to match the library's build system to R's and the package build rules as 
well. There are many ways where this can go wrong - Dirk mentioned some of them 
- and ideally not every package developer in that situation should be going 
through the pain of learning all the details the hard way.

Of course there are other packages as an example, but for someone not familiar 
with the details it's hard to see which ones do it right, and which ones don't 
- we don't always catch all the bad cases on CRAN.

I don't have a specific proposal, but if there was a GitHub repo or wiki or 
something to try to distill the useful bits from existing packages, I'd be 
happy to review it and give advice based on my experience from that macOS 
binary maintenance if that's useful.

Cheers,
Simon


> On May 12, 2023, at 8:36 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Hi Reed,
> 
> On 11 May 2023 at 11:15, Reed A. Cartwright wrote:
> | I'm curious why you chose to call cmake from make instead of from configure.
> | I've always seen cmake as part of the configure step of package building.
> 
> Great question! Couple of small answers: i) This started as a 'proof of
> concept' that aimed to be small so getting by without requiring `configure`
> seemed worth a try, ii) I had seen another src/Makevars invoking compilation
> of a static library in a similar (albeit non-cmake) way and iii) as we now
> know about section 1.2.6 (or soon 1.2.9) 'Using cmake' has it that way too.
> 
> Otherwise I quite like having `configure` and I frequently use it -- made
> from 'genuine' configure.in via `autoconf`, or as scripts in shell or other
> languages.
> 
> Cheers, Dirk
> 
> PS My repaired package is now on CRAN. I managed to bungle the static library
> build (by not telling `cmake` to use position independent code), bungled
> macOS by not telling myself where `cmake` could live, and in fixing that
> bungled Windows by forgetting to add `src/Makevars.win` fallback. Yay me.
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Are the CRAN macOS builders down?

2023-05-16 Thread Simon Urbanek
Dirk,

builds are immediate, so it is a matter of seconds for most packages. I don't 
see any issues on the Mac Builder server.
If you have a problem, please be more specific and include the check link 
returned at submission.

Cheers,
Simon
 

> On 17/05/2023, at 4:27 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> As a follow-up to the cmake questions (and me now knowing I have to tell R
> where cmake is on macOS), I uploaded a new package last Thursday. It has long
> built everywhere on CRAN, but not on macOS.  Ditto for another package update
> from Sunday (RcppSimdJson) which also has not been touched.
> 
> Should I adjust my expectations that this can take a week or longer on macOS,
> or did a few builds fall off the wagon?
> 
> Thanks as always for looking after that architecture.
> 
> Best,  Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Are the CRAN macOS builders down?

2023-05-16 Thread Simon Urbanek
Dirk,

thanks, ok, now I get what you meant. This has nothing to do with CRAN uploads 
(which are handled in Vienna); this was about specific macOS builds. The arm64 
Big Sur build machine apparently had issues. I have re-started the arm64 builds 
so they should catch up in a few hours.

Thanks,
Simon


> On 17/05/2023, at 8:39 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon:
> 
> On 17 May 2023 at 07:57, Simon Urbanek wrote:
> | builds are immediate, so it is a matter of seconds for most packages. I 
> don't see any issues on the Mac Builder server.
> | If you have a problem, please be more specific and include the check link 
> returned at submission.
> 
> I was talking about _CRAN uploads_. To be as specific as you asked:
> 
> - crc32c on CRAN since May 11, all systems apart from macOS built but all
>   macOS builds missing
>   https://cran.r-project.org/web/checks/check_results_crc32c.html
> 
> - RcppSimdJson on CRAN since May 14, six linux + windows builds made, two
>   linux builds and all macOS missing
>   https://cran.r-project.org/web/checks/check_results_RcppSimdJson.html
> 
> So no builds on macOS for either of my uploads to CRAN. Can you comment?
> 
> Dirk
> 
> | Cheers,
> | Simon
> |  
> | 
> | > On 17/05/2023, at 4:27 AM, Dirk Eddelbuettel  wrote:
> | > 
> | > 
> | > Simon,
> | > 
> | > As a follow-up to the cmake questions (and me now knowing I have to tell R
> | > where cmake is on macOS), I uploaded a new package last Thursday. It has 
> long
> | > built everywhere on CRAN, but not on macOS.  Ditto for another package 
> update
> | > from Sunday (RcppSimdJson) which also has not been touched.
> | > 
> | > Should I adjust my expectations that this can take a week or longer on 
> macOS,
> | > or did a few builds fall off the wagon?
> | > 
> | > Thanks as always for looking after that architecture.
> | > 
> | > Best,  Dirk
> | > 
> | > -- 
> | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> | > 
> | 
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Are the CRAN macOS builders down?

2023-05-17 Thread Simon Urbanek
Everything should be caught up by now
https://mac.r-project.org/reports/status.html

FWIW the macOS builds are simply run off published CRAN, so that's why the 
sequence is upload -> incoming-tests -> CRAN-src -> check+build -> macos-master 
-> CRAN-bin, where each step may involve cron jobs, which is why the total time 
from upload to published binary can take a bit of time.

Cheers,
Simon



> On 17/05/2023, at 10:55 AM, Dirk Eddelbuettel  wrote:
> 
> 
> On 17 May 2023 at 10:39, Simon Urbanek wrote:
> | Dirk,
> | 
> | thanks, ok, now I get what you meant. This has nothing to do with CRAN 
> uploads (which are handled in Vienna) this was about specific macOS builds. 
> The arm64 Big Sur build machine had apparently issues. I have re-started the 
> arm64 builds so they should catch up in a few hours.
> 
> Thanks but as I noted _all other non-arm64 macOS machines are also lagging_
> and now for about five days -- which is why I wrote the email.
> 
> But good to know you are on it now!
> 
> Dirk
> 
> | 
> | Thanks,
> | Simon
> | 
> | 
> | > On 17/05/2023, at 8:39 AM, Dirk Eddelbuettel  wrote:
> | > 
> | > 
> | > Simon:
> | > 
> | > On 17 May 2023 at 07:57, Simon Urbanek wrote:
> | > | builds are immediate, so it is a matter of seconds for most packages. I 
> don't see any issues on the Mac Builder server.
> | > | If you have a problem, please be more specific and include the check 
> link returned at submission.
> | > 
> | > I was talking about _CRAN uploads_. To be as specific as you asked:
> | > 
> | > - crc32c on CRAN since May 11, all systems apart from macOS built but all
> | >   macOS builds missing
> | >   https://cran.r-project.org/web/checks/check_results_crc32c.html
> | > 
> | > - RcppSimdJson on CRAN since May 14, six linux + windows builds made, two
> | >   linux builds and all macOS missing
> | >   https://cran.r-project.org/web/checks/check_results_RcppSimdJson.html
> | > 
> | > So no builds on macOS for either of my uploads to CRAN. Can you comment?
> | > 
> | > Dirk
> | > 
> | > | Cheers,
> | > | Simon
> | > |  
> | > | 
> | > | > On 17/05/2023, at 4:27 AM, Dirk Eddelbuettel  wrote:
> | > | > 
> | > | > 
> | > | > Simon,
> | > | > 
> | > | > As a follow-up to the cmake questions (and me now knowing I have to 
> tell R
> | > | > where cmake is on macOS), I uploaded a new package last Thursday. It 
> has long
> | > | > built everywhere on CRAN, but not on macOS.  Ditto for another 
> package update
> | > | > from Sunday (RcppSimdJson) which also has not been touched.
> | > | > 
> | > | > Should I adjust my expectations that this can take a week or longer 
> on macOS,
> | > | > or did a few builds fall off the wagon?
> | > | > 
> | > | > Thanks as always for looking after that architecture.
> | > | > 
> | > | > Best,  Dirk
> | > | > 
> | > | > -- 
> | > | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> | > | > 
> | > | 
> | > 
> | > -- 
> | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> | > 
> | 
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Problems with devtools::build() in R

2023-05-17 Thread Simon Urbanek
This thread went way off the rails and was cross-posted so the solution is on 
R-SIG-Mac.

It was simply a mismatched Fortran and R - installing the latest R and Fortran 
(from CRAN or https://mac.r-project.org/tools/) is the easiest way to solve the 
problem.

Note that R binaries and tools go together so if in doubt, just go to CRAN and 
follow the instructions.

Cheers,
Simon



> On 18/05/2023, at 3:41 AM, Ivan Krylov  wrote:
> 
> On Wed, 17 May 2023 11:05:46 -0400
> Jarrett Phillips  wrote:
> 
>> `which gfortran`  returns
>> 
>> /usr/local/bin/gfortran
> 
> I think you ran the other gfortran. Is there a gfortran installation in
> /opt/gfortran?
> 
>> libraries:
>> =/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0/:/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0/../../../../aarch64-apple-darwin22/lib/aarch64-apple-darwin22/12.2.0/:/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0/../../../../aarch64-apple-darwin22/lib/:/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0/../../../aarch64-apple-darwin22/12.2.0/:/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0/../../../
> 
>> "/Library/Frameworks/R.framework/Resources/etc/Makeconf"
> 
> If you open this file, the flags
> -L/opt/R/arm64/gfortran/lib/gcc/aarch64-apple-darwin20.6.0/12.0.1
> -L/opt/R/arm64/gfortran/lib must be present in there somewhere. (Or
> maybe it's in ~/.R/Makevars, but you would've remembered creating it
> yourself.)
> 
> What if you replace the paths with the ones returned by gfortran,
> namely, -L/usr/local/gfortran/lib/gcc/aarch64-apple-darwin22/12.2.0
> -L/usr/local/gfortran/lib? (Even better, with the paths returned by
> /opt/gfortran/bin/gfortran -print-search-dirs, assuming this command
> works.) While you're at it, fix other Fortran-related paths like the
> path to the compiler. I still suspect you may end up having problems
> because your R was built with a different version of gfortran, but I
> don't know a better way of moving forward.
> 
> I'm going on general POSIX-like knowledge since I lack a Mac to test
> things on. Maybe R-SIG-Mac will have better advice for you.
> 
> -- 
> Best regards,
> Ivan
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Test fails on M1 Mac on CRAN, but not on macOS builder

2023-05-21 Thread Simon Urbanek
Florian,

looking at the notes for 2.1-4, it says the tolerance has the wrong sign, i.e. 
you're adding it to the value on both sides of the interval (instead of 
subtracting for the lower bound). In your latest version the tolerances get 
added everywhere, which makes even less sense to me, but then, to be completely 
honest, I don't know what you actually intended. All I'm saying is: make sure 
you get the logic for the tolerance intervals right.

Cheers,
Simon


> On 19/05/2023, at 9:49 PM, Pein, Florian  wrote:
> 
> Dear everyone,
> my R package stepR (https://cran.r-project.org/web/packages/stepR/) fails the 
> CRAN package checks on M1 Mac, but the error does not occur on the macOS 
> builder (https://mac.r-project.org/macbuilder/submit.html). So, I am unable 
> to reproduce the error and hence unable to fix it (starring at the code did 
> not help either).
> 
> The relevant part is
> 
> * checking tests ...
>  Running 'testthat.R' [35s/35s]
> [36s/36s] ERROR
> Running the tests in 'tests/testthat.R' failed.
> 
>> test_check("stepR")
>  [ FAIL 1 | WARN 0 | SKIP 23 | PASS 22741 ]
> 
>  ── Failed tests ──────────────────────────────────────────────────────
> 
>  ── Failure ('test-critVal.R:2463:3'): family 'hsmuce' works ──────────
>  compare <= as.integer(ncol(compareStat) * testalpha + tolerance) is not TRUE
> 
>  `actual`:   FALSE
>  `expected`: TRUE
>  sqrt
>  Backtrace:
>  ▆
>   1. └─stepR (local) testVector(...) at test-critVal.R:2463:2
>   2.   └─testthat::expect_true(...) at test-critVal.R:50:2
> 
>  [ FAIL 1 | WARN 0 | SKIP 23 | PASS 22741 ]
>  Error: Test failures
>  Execution halted
> 
> 
> Has anyone an idea how to tackle this problem?
> 
> The test code is long (a full version is available on CRAN). The following is 
> the code part that I think is relevant (once again I cannot reproduce the 
> error, so I am also unable to give a minimal reproducible example, I can only 
> guess one):
> 
> library(stepR)
> library(testthat)
> 
> testn <- 1024L
> teststat <- monteCarloSimulation(n = 1024L, r = 100L, family = "hsmuce") # 
> essentially a matrix with values generated by rnorm()
> testalpha <- 0.0567
> tolerance <- 1e-12
> 
> ret <- critVal(n = 1024L, penalty = "sqrt", output = "vector", family = 
> "hsmuce", alpha = testalpha, stat = teststat)
> 
> statVec <- as.numeric(teststat)
> tol <- min(min(diff(sort(statVec[is.finite(statVec)]))) / 2, 1e-12) # 
> different to the CRAN version to be robust to two values very close to each 
> other
> rejected <- matrix(FALSE, ncol(teststat), nrow(teststat))
> compare <- integer(ncol(teststat))
> 
> for (i in 1:nrow(teststat)) {
>  rejected[, i] <- teststat[i, ] > ret[i] + tol
> }
> 
> for (i in 1:ncol(teststat)) {
>  compare[i] <- max(rejected[i, ])
> }
> compare <- sum(compare)
> expect_true(compare <= as.integer(ncol(teststat) * testalpha + tolerance), 
> info = "sqrt")
> 
> # version with an additional tolerance (suggested when the test failed on 
> CRAN, but it does not help either)
> # both sides are small integers, so it should not be needed
> expect_true(as.integer(compare + tolerance) <= as.integer(ncol(teststat) * 
> testalpha + tolerance) + 0.5, info = "sqrt")
> 
> 
> I am not sure how to approach this problem, so any suggestions are very much 
> welcomed.
> Many thanks and best wishes,
> Florian Pein
> (Lancaster university)
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] [External] Test fails on M1 Mac on CRAN, but not on macOS builder

2023-05-22 Thread Simon Urbanek
Florian,

ok, understood. It works for me on both M1 build machines, so can't really 
help. I'd simply submit the new version on CRAN. Of course it would help if the 
tests were more informative such as actually showing the values involved on 
failure so you could at least have an idea from the output.
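For example (a sketch reusing the variable names from Florian's earlier snippet,
not code from the package), the failing comparison could report the numbers
themselves:

```r
# Sketch: put the observed values into the failure message so a remote
# CRAN failure is diagnosable from the check log alone (variable names
# as in Florian's earlier example).
bound <- as.integer(ncol(teststat) * testalpha + tolerance)
testthat::expect_true(
  compare <= bound,
  info = sprintf("sqrt: compare = %d, bound = %d", compare, bound)
)
```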

Cheers
Simon



> On May 22, 2023, at 11:07 PM, Pein, Florian  wrote:
> 
> Dear Duncan and Simon,
> thank you both very much for you help.
> 
> I can make the test more informative and also break it down into substeps. 
> But I am unsure whether CRAN policies allow to use their system for such 
> testing steps. I rather think not. Though I must say that I still do not know 
> how to otherwise fix the error. Can anyone, ideally a CRAN maintainer, 
> confirm that this is okay?
> 
> Regarding the tolerance, the test compares to small integers. In this 
> specific situation both sides are 5L on my local system. The tolerance is 
> there to ensure that 
> ncol(compareStat) * testalpha is not something like 4.9 due to 
> floating point approximations and we end up with 4L when as.integer() is 
> applied. This is not happening in the concrete example, since 
> ncol(compareStat) * testalpha = 5.67. I am very sure that the error is on the 
> left hand side and compare is larger than 5. 
> 
> In fact, tol in
> rejected[, i] <- teststat[i, ] > ret[i] + tol
> may need to be larger, since teststat contains quite large values. I think 
> there is a realistic chance that a floating point error occurs at this point. 
> But once again, I do not want to send a random guess to CRAN when I cannot 
> test whether this has fixed the problem or not. I have tested the old code 
> with --disable-long-double and compiler flags such as -ffloat-store and 
> -fexcess-precision=standard and it works. I do not know to what degree this 
> ensures that it works on all systems.
> 
> Many thanks and best wishes,
> Florian
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] clang linker error: Symbol not found: _objc_msgSend$UTF8String

2023-06-17 Thread Simon Urbanek
Andreas,

that is actually not your problem - the stubs are generated in glib, so your 
package can do nothing about it, your compile flags won't change it. The only 
way to fix it is on my end, the proper way is to upgrade to Xcode 14 for the 
package builds, but that requires some changes to the build machine, so I'll do 
it on Monday when I'm at work, so hold on tight in the meantime.

Cheers,
Simon

Explanation of the issue for posterity: the issue is caused by Xcode 14 which 
generates those stubs[1], but can also handle them. However, older Xcode 
versions cannot. We are using macOS 11 target and SDK to ensure compatibility 
with older macOS versions, but apparently Xcode 14 assumes that the linking 
will still happen with Xcode 14 even if libraries are compiled for older 
targets. Therefore the proper fix is to make sure that packages are also linked 
with Xcode 14. Another work-around would be to compile glib with 
-fno-objc-msgsend-selector-stubs so it would also work with older Xcode, but 
it's more future-proof to just upgrade Xcode.

[1] https://github.com/llvm/llvm-project/issues/56034


> On Jun 17, 2023, at 7:07 PM, Andreas Blätte  
> wrote:
> 
> Dear colleagues,
> 
> 
> 
> after submitting a release of my package RcppCWB (no problems with test 
> servers), CRAN check results reported ERRORS on the macOS check systems: 
> https://cran.r-project.org/web/checks/check_results_RcppCWB.html
> 
> 
> 
> The core issue is that when test-loading the package, you get the error: Symbol not 
> found: _objc_msgSend$UTF8String
> 
> 
> 
> Picking up a solution discussed here (disable objc_msgSend stubs in clang), I 
> modified the configure script of my package to pass the flag 
> “-fno-objc-msgsend-selector-stubs“ to the linker, which I thought would solve 
> the problem.
> 
> 
> 
> However: The CRAN Debian system for incoming R packages uses clang 15, which 
> does not accept this flag any more, resulting in an error.
> 
> 
> 
> Certainly, I could refine my configure script to address a very specific 
> scenario on CRAN macOS systems, i.e. making usage of the flag conditional on 
> a specific clang version. But I am not sure whether this is the way to go. It 
> would feel like a hack I would like to avoid.
> 
> 
> 
> Has anybody encountered this error? Is there a best practice or a recommended 
> solution? I would be very glad to get your advice!
> 
> 
> 
> Kind regards
> 
> Andreas
> 
> 
> 
> --
> 
> Prof. Dr. Andreas Blaette
> 
> Professor of Public Policy
> 
> University of Duisburg-Essen
> 
> 
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] clang linker error: Symbol not found: _objc_msgSend$UTF8String

2023-06-18 Thread Simon Urbanek


Andreas,

Xcode update fixed the issue as expected so in due time the ERRORs should 
disappear.

Cheers,
Simon


> On 18/06/2023, at 10:29 AM, Simon Urbanek  wrote:
> 
> Andreas,
> 
> that is actually not your problem - the stubs are generated in glib, so your 
> package can do nothing about it, your compile flags won't change it. The only 
> way to fix it is on my end, the proper way is to upgrade to Xcode 14 for the 
> package builds, but that requires some changes to the build machine, so I'll 
> do it on Monday when I'm at work, so hold on tight in the meantime.
> 
> Cheers,
> Simon
> 
> Explanation of the issue for posterity: the issue is caused by Xcode 14 which 
> generates those stubs[1], but can also handle them. However, older Xcode 
> versions cannot. We are using macOS 11 target and SDK to ensure compatibility 
> with older macOS versions, but apparently Xcode 14 assumes that the linking 
> will still happen with Xcode 14 even if libraries are compiled for older 
> targets. Therefore the proper fix is to make sure that packages are also 
> linked with Xcode 14. Another work-around would be to compile glib with 
> -fno-objc-msgsend-selector-stubs so it would also work with older Xcode, but 
> it's more future-proof to just upgrade Xcode.
> 
> [1] https://github.com/llvm/llvm-project/issues/56034
> 
> 
>> On Jun 17, 2023, at 7:07 PM, Andreas Blätte  
>> wrote:
>> 
>> Dear colleagues,
>> 
>> 
>> 
>> after submitting a release of my package RcppCWB (no problems with test 
>> servers), CRAN check results reported ERRORS on the macOS check systems: 
>> https://cran.r-project.org/web/checks/check_results_RcppCWB.html
>> 
>> 
>> 
>> The core issue is that when test-loading the package, you get the error: Symbol 
>> not found: _objc_msgSend$UTF8String
>> 
>> 
>> 
>> Picking up a solution discussed here (disable objc_msgSend stubs in clang), 
>> I modified the configure script of my package to pass the flag 
>> “-fno-objc-msgsend-selector-stubs“ to the linker, which I thought would 
>> solve the problem.
>> 
>> 
>> 
>> However: The CRAN Debian system for incoming R packages uses clang 15, which 
>> does not accept this flag any more, resulting in an error.
>> 
>> 
>> 
>> Certainly, I could refine my configure script to address a very specific 
>> scenario on CRAN macOS systems, i.e. making usage of the flag conditional on 
>> a specific clang version. But I am not sure whether this is the way to go. 
>> It would feel like a hack I would like to avoid.
>> 
>> 
>> 
>> Has anybody encountered this error? Is there a best practice or a recommended 
>> solution? I would be very glad to get your advice!
>> 
>> 
>> 
>> Kind regards
>> 
>> Andreas
>> 
>> 
>> 
>> --
>> 
>> Prof. Dr. Andreas Blaette
>> 
>> Professor of Public Policy
>> 
>> University of Duisburg-Essen
>> 
>> 
>> 
>> 
>> 
>> 
>>  [[alternative HTML version deleted]]
>> 
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>> 
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Questions regarding a new (seperated package) and how to submit them to cran

2023-06-22 Thread Simon Urbanek
Bernd,

the sequence in which you submit doesn't matter - the packages have to work 
regardless of the sequence. Suggests means that the dependency is optional, not 
that it can break tests. You have to skip the tests that cannot be run due to 
missing dependencies (see 1.1.3.1 in R-exts)
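A sketch of such a guard (the package names are from Bernd's description; the
test itself is illustrative):

```r
# In a testthat test file: skip (rather than fail) when an optional
# Suggests dependency is not installed, per WRE 1.1.3.1.
testthat::test_that("addon integration works", {
  testthat::skip_if_not_installed("dartR.sim")
  # ... tests that exercise dartR.sim ...
})

# Outside testthat (e.g. in examples or vignettes), the equivalent guard:
if (requireNamespace("dartR.sim", quietly = TRUE)) {
  # code that uses dartR.sim
}
```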

Cheers,
Simon



> On Jun 23, 2023, at 2:35 PM, Bernd.Gruber  
> wrote:
> 
> Hi,
> 
> I have a question regarding the separation of a package into smaller pieces 
> (to avoid long testing/installation times and, more importantly, to avoid too 
> many dependencies)
> 
> I am the maintainer of an R package (dartR) which has grown and is now at the 
> limit in terms of testing/run time and also dependencies. To further develop 
> the package we started to break the package into smaller packages namely
> 
> 
> Two core packages (dartR.base and dartR.data) and here dartR.base has 
> dartR.data in the depends. (dartR.base is 60% of the previous package) and 
> dartR.data is our data.package for test data (dartR.data is already on CRAN)
> 
> 
> 
> 
> Next to the two core packages we also have 3 more addon packages that deal 
> with specialised analysis
> 
> dartR.sim
> dartR.spatial
> dartR.popgenomics.
> 
> Those packages depend on dartR.base and dartR.data.
> 
> All addon packages and core packages should have the other addon packages as 
> suggests, hence here comes the question.
> 
> 
> How do I submit the packages?  All of them at once? Or step by step.
> 
> If I submit step by step (e.g. dartR.base) it obviously cannot have the other 
> dartR addon packages as suggests (cannot be tested and will break the CRAN 
> tests).
> 
> So would be the correct way to:
> Submit dartR.base (without dartR.sim, dartR.spatial and dartR.popgenomics in 
> the suggest.)
> Then submit dartR.sim, then dartR.spatial and finally dartR.popgenomics (all 
> without suggests of the other packages)
> 
> And finally update all packages (changing only their DESCRIPTION files to add 
> the suggests once they are on CRAN).
> 
> Hope that makes sense and thanks in advance,
> 
> Bernd
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] NOTE about missing package ‘emmeans’ on macos-x86_64

2023-06-23 Thread Simon Urbanek



> On Jun 24, 2023, at 12:19 AM, Uwe Ligges  
> wrote:
> 
> 
> 
> On 23.06.2023 11:27, Helmut Schütz wrote:
>> Dear all,
>> since a while (January?) we face NOTEs in package checks 
>> (https://cran.r-project.org/web/checks/check_results_PowerTOST.html):
>> Version: 1.5-4
>> Check: package dependencies
>> Result: NOTE
>> Package suggested but not available for checking: ‘emmeans’
>> Flavor: r-release-macos-x86_64
>> Version: 1.5-4
>> Check: Rd cross-references
>> Result: NOTE
>> Package unavailable to check Rd xrefs: ‘emmeans’
>> Flavor: r-release-macos-x86_64
>> First I thought that ‘emmeans’ is not available for macos-x86_64 on CRAN.
>> However, ‘emmeans’ itself passed all checks 
>> (https://cran.r-project.org/web/checks/check_results_emmeans.html).
>> Since we want to submit v1.5-5 of PowerTOST soon, any ideas?
> 
> Please go ahead. Simon rarely updates the check results, so I guess this was 
> a coincidence at the time and never got updated. I'd ignore this one.
> 

Correct, packages are only re-checked if they failed the check before. Once a 
package passes the checks the results are not re-run, because it would take way 
too long given how many packages we have (re-running all takes 2-3 days).

If you don't intend to update your package and want such NOTEs to disappear, 
send me an email and I can run it by hand (I did now for PowerTOST and the NOTE 
is gone).

Cheers,
Simon

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Convention or standards for using header library (e.g. Eigen)

2023-06-23 Thread Simon Urbanek
Stephen,

If you want to give the system version a shot, I would simply look for 
pkg-config, add the supplied CPPFLAGS to the package R flags if present and 
then test (regardless of pkg-config) with AC_CHECK_HEADER (see standard R-exts 
autoconf rules for packages). If that fails then use your included copy by 
adding the corresponding -I flag pointing to your supplied copy. You should not 
download anything, as there is no expectation that the user has any internet 
access at the time of installation, so if you want to provide a fall-back, 
it should be in the sources of your package. That said, there is nothing wrong 
with ignoring the system version especially in this header-only case since you 
can then rely on the correct version which you tested - you can still allow the 
user to provide an option to override that behavior if desired. 
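A sketch of that probe as an R helper script (in the spirit of the package's
existing tools/extlibs.R; the pkg-config module name and fallback path are
illustrative assumptions, not CRAN requirements):

```r
# Prefer a system Eigen found via pkg-config; otherwise fall back to the
# copy bundled in the package sources under extlibs/.
pc <- Sys.which("pkg-config")
cppflags <- ""
if (nzchar(pc)) {
  out <- suppressWarnings(
    system2(pc, c("--cflags", "eigen3"), stdout = TRUE, stderr = FALSE)
  )
  if (is.null(attr(out, "status")) && length(out) > 0) cppflags <- out[1]
}
if (!nzchar(cppflags)) {
  cppflags <- "-I../extlibs"   # bundled headers shipped with the package
}
cat(cppflags)   # to be captured by Makevars when setting PKG_CPPFLAGS
```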

Cheers,
Simon



> On Jun 23, 2023, at 10:08 PM, Stephen Wade  wrote:
> 
> I recently submitted a package to CRAN which downloaded Eigen via Makevars
> and Makevars.win. My Makevars.ucrt was empty as I noted that Eigen3 is
> installed by default (however, this doesn't ensure that a version of Eigen
> compatible/tested with the package is available).
> 
> The source is currently on github:
> https://github.com/stephematician/literanger
> 
> Here is the Makevars
> 
> $ more src/Makevars
> # downloads eigen3 to extlibs/ and sets include location
> PKG_CPPFLAGS = -I../src -I../extlibs/
> .PHONY: all clean extlibs
> all: extlibs $(SHLIB)
> extlibs:
> "${R_HOME}/bin${R_ARCH_BIN}/Rscript" "../tools/extlibs.R"
> clean:
> rm -f $(SHLIB) $(OBJECTS)
> 
> The details of `extlibs.R` are fairly mundane, it downloads a release from
> gitlab and unzips it to `extlibs`.
> 
> CRAN gave me this feedback:
> 
>> Why do you download eigen here rather than using the system version of
>> Eigen if available?
>> 
>> We asked you to do that for Windows as you did in Makevars.ucrt. For
>> Unix-like OS you should only fall back (if at all) to some download if
>> the system Eigen is unavailable.
> 
> The problem is I'm not sure what a minimum standard to 'searching' for a
> system version of Eigen looks like. I also note that packages like
> RcppEigen simply bundle the Eigen headers within the package (and its
> repository) which will certainly ignore any system headers.
> 
> I would like a solution that would keep CRAN happy, i.e. i need to meet
> some standard for searching for the compiler flags, checking the version of
> the system headers, and then falling through to download release if the
> system headers fail.
> 
> 1.  For each platform (Unix, Windows, OS-X) what tool(s) should be invoked
> to check for compiler flags for a header-only library like Eigen? e.g.
> pkg-config, pkgconf? others?
> 2.  What is a reasonable approach for the possible package names for Eigen
> (e.g. typically libeigen3-dev on Debian, and eigen3 on arch, homebrew,
> others)? Is this enough?
> 3.  If pkg-config/pkgconf (or others) are unavailable, what is a reasonable
> standard for checking if the library can be built with some reasonable
> guess for the compiler flags (probably empty) - I assume I would need to
> try to compile a test program (within Makevars)?
> 4.  Following on from 3... would a package need to check (again via a test
> program) that the _system_ headers have the correct version (e.g. some
> static assert on EIGEN_WORLD_VERSION), and if that fails _then_ download
> the release from gitlab?
> 
> Any and all advice would be appreciated.
> 
> Kind regards,
> -Stephen Wade
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] gfortran: command not found

2023-07-05 Thread Simon Urbanek
To quote from the page you downloaded R from:

This release uses Xcode 14.2/14.3 and GNU Fortran 12.2. If you wish to compile 
R packages which contain Fortran code, you may need to download the 
corresponding GNU Fortran compiler from https://mac.R-project.org/tools. 
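A quick sanity check from Terminal; the `/opt/gfortran` prefix below is an assumption about where the CRAN installer places the compiler, so adjust it if your installation differs:

```shell
# Look for gfortran on PATH first, then in the assumed installer prefix.
GFORTRAN=$(command -v gfortran 2>/dev/null || true)
if [ -z "$GFORTRAN" ] && [ -x /opt/gfortran/bin/gfortran ]; then
  GFORTRAN=/opt/gfortran/bin/gfortran   # installed, but not on PATH yet
fi
if [ -n "$GFORTRAN" ]; then
  echo "gfortran found at $GFORTRAN"
else
  echo "gfortran missing - download it from https://mac.R-project.org/tools"
fi
```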



> On Jul 6, 2023, at 11:50 AM, Spencer Graves 
>  wrote:
> 
> Hello:
> 
> 
> "R CMD build KFAS" under macOS 11.7.8 stopped with:
> 
> 
> using C compiler: ‘Apple clang version 12.0.5 (clang-1205.0.22.9)’
> sh: gfortran: command not found
> using SDK: ‘MacOSX11.3.sdk’
> gfortran -arch x86_64  -fPIC  -Wall -g -O2  -c  approx.f90 -o approx.o
> make: gfortran: No such file or directory
> make: *** [approx.o] Error 1
> ERROR: compilation failed for package ‘KFAS'
> 
> 
> My web search suggests several different ways to fix this problem, 
> but I don't know which to try.
> 
> 
> 
> Suggestions?
> Thanks,
> Spencer Graves
> 
> 
> p.s.  I have both "brew" and "port" installed.  I recently used "port" to 
> upgrade another software package.  A web search suggested the following:
> 
> 
> sudo port install gcc48
> sudo port select -set gcc mp-gcc48
> 
> 
> However, this comment was posted roughly 9 years ago.  Below please 
> find sessionInfo().
> 
> 
> sessionInfo()
> R version 4.3.1 (2023-06-16)
> Platform: x86_64-apple-darwin20 (64-bit)
> Running under: macOS Big Sur 11.7.8
> 
> Matrix products: default
> BLAS: 
> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
>  
> LAPACK: 
> /Library/Frameworks/R.framework/Versions/4.3-x86_64/Resources/lib/libRlapack.dylib;
>   LAPACK version 3.11.0
> 
> locale:
> [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
> 
> time zone: America/Chicago
> tzcode source: internal
> 
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base
> 
> loaded via a namespace (and not attached):
> [1] compiler_4.3.1  R6_2.5.1magrittr_2.0.3  cli_3.6.1
> [5] tools_4.3.1 glue_1.6.2  rstudioapi_0.14 roxygen2_7.2.3
> [9] xml2_1.3.4  vctrs_0.6.2 stringi_1.7.12  knitr_1.42
> [13] xfun_0.39   stringr_1.5.0   lifecycle_1.0.3 rlang_1.1.1
> [17] purrr_1.0.1
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Best practices for CRAN package using Go

2023-07-12 Thread Simon Urbanek
Dewey,

you will definitely need to include all the necessary sources for your package. 
You may want to have a look at the "Using Rust"[1] document linked from the 
CRAN policy. I think Go is quite similar to Rust in that sense so you should 
use the same approach, i.e. checking for system and user installations (for go 
the official location is /usr/local/go/bin/go and it may not be on the PATH), 
declaring Go version dependency and making sure your package has included all 
module dependency sources (i.e. don't use install-time module 
resolution/download).
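A minimal sketch of that toolchain lookup (PATH first, then the official location; checking the found version against SystemRequirements is omitted here):

```shell
# Check PATH first, then the official install location, which - as noted
# above - may not be on PATH.
GO=$(command -v go 2>/dev/null || true)
if [ -z "$GO" ] && [ -x /usr/local/go/bin/go ]; then
  GO=/usr/local/go/bin/go
fi
if [ -n "$GO" ]; then
  "$GO" version
else
  echo "no Go toolchain found - see SystemRequirements"
fi
```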

If you need to include a large source tar ball that is not permissible on CRAN, 
I'd recommend using Zenodo.org since it is specifically designed to facilitate 
longevity and reproducibility (as opposed to Github or other transient storage 
that may disappear at any point).

All that said, you may run into the same issues as Rust (errors and segfaults 
due to limited interoperability of compilers) so use with care and test well. 
External bindings like Rust or Go are only provided on "best effort" basis.

Cheers,
Simon

[1] - https://cran.r-project.org/web/packages/using_rust.html

PS: go is now available on the CRAN macOS builder machines and the Mac Builder 
(https://mac.r-project.org/macbuilder/submit.html).


> On 13/07/2023, at 2:36 AM, Dewey Dunnington  wrote:
> 
> Thank you! It seems I needed the refresher on CRAN policy regarding 
> downloading sources: it seems like the go.sum/go.mod provide sufficient 
> checksumming to comply with the policy, as you noted (with `go mod vendor` as 
> a backup if this turns out to not be acceptable). Downloading Go is probably 
> out based on the advice for Rust that explicitly forbids this.
> 
> Cheers!
> 
> -dewey
> 
> On 2023-07-10 11:09, Ivan Krylov wrote:
>> В Thu, 06 Jul 2023 15:22:26 -0300
>> Dewey Dunnington  пишет:
>>> I've wrapped two of these drivers for R that seem to build and
>>> install on MacOS, Linux, and Windows [3][4]; however, I am not sure
>>> if the pattern I used is suitable for CRAN or whether these packages
>>> will have to be GitHub-only for the foreseeable future.
>> There are a few parts to following the CRAN policy [*] regarding
>> external dependencies.
>> I think (but don't know for sure) that your package will not be allowed
>> to download Go by itself. The policy says: "Only as a last resort and
>> with the agreement of the CRAN team should a package download
>> pre-compiled software."
>> An already installed Go should be able to "first look to see if [a
>> dependency] is already installed and if so is of a suitable version"
>> when installing the dependencies of the Go part of the code. The go.mod
>> and go.sum files specify the exact versions and checksums of the
>> dependencies, which satisfies the requirement for fixed dependency
>> versions ("it is acceptable to download them as part of installation,
>> but do ensure that the download is of a fixed version rather than the
>> latest"), so your package seems to be fine in this respect.
>> One more thing: when bootstrapping the source package, can you run go
>> mod vendor [**] in order to bundle *all* the Go dependencies together
>> with the package? Is the resulting directory prohibitively large? Would
>> it satisfy the CRAN policy preference to "include the library sources
>> in the package and compile them as part of package installation"
>> without requiring Internet access? Unfortunately, I don't know enough
>> about Go to answer these questions myself. I think that a small bundle
>> of vendored Go code would be preferrable for CRAN but *not* preferrable
>> for packaging in a GNU/Linux distro like Debian where dynamic linking
>> (in the widest possible sense) is a strong preference.
>> --
>> Best regards,
>> Ivan
>> [*] https://cran.r-project.org/web/packages/policies.html
>> [**] https://go.dev/ref/mod#vendoring
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-12 Thread Simon Urbanek
Yutani,

I'm not quite sure your reading fully matches the intent of the policy. 
Cargo.lock is not sufficient, it is expected that the package will provide 
*all* the sources, it is not expected to use cargo to resolve them from random 
(possibly inaccessible) places. So the package author is expected to either 
include the sources in the package *or* (if prohibitive due to extreme size) 
have a release tar ball available at a fixed, secure, reliable location (I was 
recommending Zenodo.org for that reason - GitHub is neither fixed nor reliable 
by definition).

Based on that, I'm not sure I fully understand the scope of your proposal for 
improvement. Cargo.lock is certainly the first step that the package author 
should take in creating the distribution tar ball so you can fix the versions, 
but it is not sufficient as the next step involves collecting the related 
sources. We don't want R users to be involved in that can of worms (especially 
since the lock file itself provides no guarantees of accessibility of the 
components and we don't want to have to manually inspect it), the package 
should be ready to be used which is why it has to do that step first. Does that 
explain the intent better? (In general, the downloading at install time is 
actually a problem, because it's not uncommon to use R in environments that 
have no Internet access, but the download is a concession for extreme cases 
where the tar balls may be too big to make it part of the package, but it's yet 
another can of worms...).

Cheers,
Simon



> On 13/07/2023, at 12:37 PM, Hiroaki Yutani  wrote:
> 
> Hi,
> 
> I'm glad to see CRAN now has its official policy about Rust [1]!
> It seems it probably needs some feedback from those who are familiar with
> the Rust workflow. I'm not an expert, but let me leave some quick feedback.
> This email is sent to the R-package-devel mailing list as well as to cran@~
> so that we can publicly discuss.
> 
> It seems most of the concern is about how to make the build deterministic.
> In this regard, the policy should encourage including "Cargo.lock" file
> [2]. Cargo.lock is created on the first compile, and the resolved versions
> of dependencies are recorded. As long as this file exists, the dependency
> versions are locked to the ones in this file, except when the package
> author explicitly updates the versions.
> 
> Cargo.lock also records the SHA256 checksums of the crates if they are from
> crates.io, Rust's official crate registry. If the checksums don't match,
> the build will fail with the following message:
> 
>error: checksum for `foo v0.1.2` changed between lock files
> 
>this could be indicative of a few possible errors:
> 
>* the lock file is corrupt
>* a replacement source in use (e.g., a mirror) returned a different
> checksum
>* the source itself may be corrupt in one way or another
> 
>unable to verify that `foo v0.1.2` is the same as when the lockfile was
> generated
> 
> For dependencies from Git repositories, Cargo.lock records the commit
> hashes. So, the version of the source code (not the version of the crate)
> is uniquely determined. That said, unlike crates.io, it's possible that the
> commit or the Git repository itself has disappeared at the time of
> building, which makes the build fail. So, it might be reasonable the CRAN
> policy prohibits the use of Git dependency unless the source code is
> bundled. I have no strong opinion here.
> 
> Accordingly, I believe this sentence
> 
>> In practice maintainers have found it nigh-impossible to meet these
> conditions whilst downloading as they have too little control.
> 
> is not quite true. More specifically, these things
> 
>> The standard way to download a Rust ‘crate’ is by its version number, and
> these have been changed without changing their number.
>> Downloading a ‘crate’ normally entails downloading its dependencies, and
> that is done without fixing their version numbers
> 
> won't happen if the R package does include Cargo.lock because
> 
> - if the crate is from crates.io, "the version can never be overwritten,
> and the code cannot be deleted" there [3]
> - if the crate is from a Git repository, the commit hash is unique in its
> nature. The version of the crate might be the same between commits, but a
> git dependency is specified by the commit hash, not the version of the
> crate.
> 
> I'm keen to know what problems the CRAN maintainers have experienced that
> Cargo.lock cannot solve. I hope we can help somehow to improve the policy.
> 
> Best,
> Yutani
> 
> [1]: https://cran.r-project.org/web/packages/using_rust.html
> [2]: https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
> [3]: https://doc.rust-lang.org/cargo/reference/publishing.html
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 


Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-12 Thread Simon Urbanek



> On 13/07/2023, at 2:50 PM, Kevin Ushey  wrote:
> 
> Package authors could use 'cargo vendor' to include Rust crate sources 
> directly in their source R packages. Would that be acceptable?
> 


Yes, that is exactly what was suggested in the original thread.

Cheers,
Simon



> Presumedly, the vendored sources would be built using the versions specified 
> in an accompanying Cargo.lock as well.
> 
> https://doc.rust-lang.org/cargo/commands/cargo-vendor.html
> 
> 
> On Wed, Jul 12, 2023, 7:35 PM Simon Urbanek  
> wrote:
> Yutani,
> 
> I'm not quite sure your reading fully matches the intent of the policy. 
> Cargo.lock is not sufficient, it is expected that the package will provide 
> *all* the sources, it is not expected to use cargo to resolve them from 
> random (possibly inaccessible) places. So the package author is expected to 
> either include the sources in the package *or* (if prohibitive due to extreme 
> size) have a release tar ball available at a fixed, secure, reliable location 
> (I was recommending Zenodo.org for that reason - GitHub is neither fixed nor 
> reliable by definition).
> 
> Based on that, I'm not sure I fully understand the scope of your proposal for 
> improvement. Cargo.lock is certainly the first step that the package author 
> should take in creating the distribution tar ball so you can fix the 
> versions, but it is not sufficient as the next step involves collecting the 
> related sources. We don't want R users to be involved in that can of worms 
> (especially since the lock file itself provides no guarantees of 
> accessibility of the components and we don't want to have to manually inspect 
> it), the package should be ready to be used which is why it has to do that 
> step first. Does that explain the intent better? (In general, the downloading 
> at install time is actually a problem, because it's not uncommon to use R in 
> environments that have no Internet access, but the download is a concession 
> for extreme cases where the tar balls may be too big to make it part of the 
> package, but it's yet another can of worms...).
> 
> Cheers,
> Simon
> 
> 
> 
> > On 13/07/2023, at 12:37 PM, Hiroaki Yutani  wrote:
> > 
> > Hi,
> > 
> > I'm glad to see CRAN now has its official policy about Rust [1]!
> > It seems it probably needs some feedback from those who are familiar with
> > the Rust workflow. I'm not an expert, but let me leave some quick feedback.
> > This email is sent to the R-package-devel mailing list as well as to cran@~
> > so that we can publicly discuss.
> > 
> > It seems most of the concern is about how to make the build deterministic.
> > In this regard, the policy should encourage including "Cargo.lock" file
> > [2]. Cargo.lock is created on the first compile, and the resolved versions
> > of dependencies are recorded. As long as this file exists, the dependency
> > versions are locked to the ones in this file, except when the package
> > author explicitly updates the versions.
> > 
> > Cargo.lock also records the SHA256 checksums of the crates if they are from
> > crates.io, Rust's official crate registry. If the checksums don't match,
> > the build will fail with the following message:
> > 
> >error: checksum for `foo v0.1.2` changed between lock files
> > 
> >this could be indicative of a few possible errors:
> > 
> >* the lock file is corrupt
> >* a replacement source in use (e.g., a mirror) returned a different
> > checksum
> >* the source itself may be corrupt in one way or another
> > 
> >unable to verify that `foo v0.1.2` is the same as when the lockfile was
> > generated
> > 
> > For dependencies from Git repositories, Cargo.lock records the commit
> > hashes. So, the version of the source code (not the version of the crate)
> > is uniquely determined. That said, unlike crates.io, it's possible that the
> > commit or the Git repository itself has disappeared at the time of
> > building, which makes the build fail. So, it might be reasonable the CRAN
> > policy prohibits the use of Git dependency unless the source code is
> > bundled. I have no strong opinion here.
> > 
> > Accordingly, I believe this sentence
> > 
> >> In practice maintainers have found it nigh-impossible to meet these
> > conditions whilst downloading as they have too little control.
> > 
> > is not quite true. More specifically, these things
> > 
> >> The standard way to download a Rust ‘crate’ is by its version number, and
> > these ha

Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-13 Thread Simon Urbanek
Yutani,

[moving back to the original thread, please don't cross-post]


> On Jul 13, 2023, at 3:34 PM, Hiroaki Yutani  wrote:
> 
> Hi Simon,
> 
> Thanks for the response. I thought
> 
>> download a specific version from a secure site and check that the
> download is the expected code by some sort of checksum
> 
> refers to the usual process that's done by Cargo automatically. If it's
> not, I think the policy should have a clear explanation. It seems it's not
> only me who wondered why this policy doesn't mention Cargo.lock at all.
> 


as explained. The instructions will be updated to make it clear that "cargo 
vendor" is the right tool here.
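For illustration, the author-side steps look roughly like this; the `src/rust` layout is an assumption, and the config.toml contents are the snippet `cargo vendor` itself prints on completion:

```shell
# Author-side, before `R CMD build` (assumed layout: crate in src/rust/):
#   (cd src/rust && cargo vendor)    # copies all dependency sources into vendor/
# Then point cargo at the vendored copies:
mkdir -p src/rust/.cargo
cat > src/rust/.cargo/config.toml <<'EOF'
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
EOF
# At install time the build then needs no network access:
#   cargo build --offline --release
```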


>> it is not expected to use cargo to resolve them from random (possibly
> inaccessible) places
> 
> Yes, I agree with you. So, I suggested the possibility of forbidding the Git 
> dependency. Or, do you call crates.io, Rust's official repository, "random 
> places"?


No, as I understand it, the lock file can have arbitrary URLs, that's what I 
was referring to.


> If CRAN cannot trust even the official one of Rust, why does CRAN have Rust 
> at all?
> 


I don't see the connection - if you downloaded something in the past it doesn't 
mean you will be able to do so in the future. And CRAN has Rust because it 
sounded like a good idea to allow packages to use it, but I can see that it 
opened a can of worms that we are trying to tame here.


> That said, I agree with your concern about downloading via the Internet in
> general. Downloading is one of the common sources of failure. If you want
> to prevent cargo from downloading any source files, you can enforce adding
> --offline option to "cargo build". While the package author might feel
> unhappy, I think this would make your intent a bit clearer.
> 


I'm not a cargo expert, but I thought cargo build --offline is not needed if 
the dependencies are already vendored? If you think cargo users need more help 
with the steps, then feel free to propose what the instructions should say (we 
really assume that the authors know what they are doing).

Cheers,
Simon



Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-13 Thread Simon Urbanek



> On Jul 14, 2023, at 11:19 AM, Hadley Wickham  wrote:
> 
>>> If CRAN cannot trust even the official one of Rust, why does CRAN have Rust 
>>> at all?
>>> 
>> 
>> I don't see the connection - if you downloaded something in the past it 
>> doesn't mean you will be able to do so in the future. And CRAN has Rust 
>> because it sounded like a good idea to allow packages to use it, but I can 
>> see that it opened a can of worms that we are trying to tame here.
> 
> Can you give a bit more detail about your concerns here? Obviously
> crates.io isn't some random site on the internet, it's the official
> repository of the Rust language, supported by the corresponding
> foundation for the language. To me that makes it feel very much like
> CRAN, where we can assume if you downloaded something in the past, you
> can download something in the future.
> 

I was just responding to Yutani's question why we downloaded the Rust compilers 
on CRAN at all. This has really nothing to do with the previous discussion 
which is why I did say "I don't see the connection". Also I wasn't talking 
about crates.io anywhere in my responses in this thread. The only thing I 
wanted to discuss here was that I think the existing Rust model  ("vendor" into 
the package sources) seems like a good one to apply to Go, but that got somehow 
hijacked...

Cheers,
Simon



Re: [R-pkg-devel] macOS results not mirrored/updated at CRAN

2023-07-15 Thread Simon Urbanek
I looked into it and there was no issue on the build machine or staging server, 
so it will require some more digging in the international waters .. hopefully 
sometime next week…

Cheers,
Simon


> On 16/07/2023, at 11:25, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> On 12 July 2023 at 19:02, Dirk Eddelbuettel wrote:
> | 
> | Simon,
> | 
> | It looks like some result mirroring / pushing from your machines to CRAN 
> fell
> | over.  One of my packages, digest 0.6.33, arrived on CRAN about a week ago,
> | is built almost everywhere (apart from macOS_release_x86_64 stuck at 0.6.32)
> | but the result page still has nags from the 0.6.31 build for macOS release
> | and one of the oldrel builds.
> | 
> | Could you look into that?  And if it is "just" general issue at CRAN as per
> | Uwe's email earlier I will happily wait.  But it has been in this frozen /
> | partial update of results state for a few days now.
> 
> digest on macOS x86_64 is still stuck at 0.6.31 results on the summary
> displaying a now very stale and factually incorrect NOTE.
> 
> Can you please look into this?
> 
> Thanks,  Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 



Re: [R-pkg-devel] macOS results not mirrored/updated at CRAN

2023-07-22 Thread Simon Urbanek
Dirk,

thanks. I have tried to address the problem and the actual sync problem for 
big-sur-x86_64 was fixed (as you can see, the results have been updated 
after you reported it), but apparently there was another, independent, problem 
with the cron jobs on that machine. I have changed the way the results sync is 
triggered, so hopefully that will make it more reliable.

Cheers,
Simon


> On Jul 23, 2023, at 12:26 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> This still persists. As Murray reported, it happened for a while now, it is
> still happening eg package tiledb has been rebuilt everywhere [1] since the
> upload a few days ago -- yet the results page still reports builds two
> uploads ago [2] for both arm64 variants of your macOS setup.
> 
> Can you take a look, please?
> 
> Thanks in advance,  Dirk
> 
> 
> [1] https://cran.r-project.org/package=tiledb
> [2] https://cran.r-project.org/web/checks/check_results_tiledb.html
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 



Re: [R-pkg-devel] macOS results not mirrored/updated at CRAN

2023-08-10 Thread Simon Urbanek
Dirk,

thanks - one of those annoying cases where a script works in the login shell, 
but not in the cron job -- hopefully fixed.

Cheers,
Simon


> On 9/08/2023, at 12:45 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> This is still an issue for arm64.  Uploaded tiledb and RQuantLib yesterday,
> both already built binaries for macOS (thank you!) but on the x86_64 ones are
> on the results page.  Can you take another peek at this?
> 
> Thanks so much,  Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 



Re: [R-pkg-devel] gdb availability on r-devel-linux-x86_64-fedora-gcc

2023-08-13 Thread Simon Urbanek



> On 14/08/2023, at 5:25 AM, Jamie Lentin  wrote:
> 
> Thanks both!
> 
> On 2023-08-12 23:52, Uwe Ligges wrote:
>> On 12.08.2023 23:19, Dirk Eddelbuettel wrote:
>>> On 12 August 2023 at 18:12, Uwe Ligges wrote:
>>> | On 12.08.2023 15:10, Jamie Lentin wrote:
>>> | > The system call in question is done by the TMB package[2], and not ours
>>> | > to tinker with:
>>> | >
>>> | >cmd <- paste("R --vanilla < ",file," -d gdb --debugger-args=\"-x",
>>> | > gdbscript,"\"")
>>> | >txt <- system(cmd,intern=TRUE,ignore.stdout=FALSE,ignore.stderr=TRUE)
>>> | >
>>> | > My only vaguely reasonable guess is that gdb isn't available on the host
>>> | > in question (certainly R will be!). How likely is this? Is it worth
>>> | > trying to resubmit with the call wrapped with an "if (gdb is on the 
>>> path)"?
>>> |
>>> | I guess it is really not available as that system got an update.
>>> | Note that your package does not declare any SystemRequirements. Please do
>>> | so and mention gdb.
> 
> It's TMB::gdbsource() that's calling system("R -d gdb"), so presumably the 
> SystemRequirements should live there rather than gadget3? I can raise an 
> issue suggesting this.
> 
>>> | Wrapping it in "if (gdb is on the path)" seems a good solution.
>>> Seconded esp as some systems may have lldb instead of gdb, or neither.
>>> Adding a simple `if (nzchar(Sys.which("gdb")))` should get you there.
>>> Dirk
>> Note that also
>> 1. The machine does not have R on the path (but Rdev)
> 
> Okay, I'll check for "all(nzchar(Sys.which(c('gdb', 'R'))))". This is 
> overkill somewhat, and the example won't run in some environments that 
> TMB::gdbsource() should work in. However, at least it'll check it does work 
> for the relatively default case.
> 

Please note that it should not be calling some random program called R - it 
should be calling the R instance it's running in (as Uwe pointed out there may 
be several) so possibly something like file.path(R.home(),"bin","R")
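The same rule seen from the shell side - a helper script spawned by R can rely on R_HOME pointing at the running installation (a sketch, not TMB's actual code):

```shell
# Inside an R session (and in processes it spawns) R_HOME identifies the
# running installation, so call that binary rather than whatever "R" (or
# "Rdev") happens to be on PATH.
if [ -n "${R_HOME:-}" ] && [ -x "${R_HOME}/bin/R" ]; then
  R_BIN="${R_HOME}/bin/R"
  echo "using $R_BIN"
else
  echo "R_HOME not set - not running under R"
fi
```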

Cheers,
Simon


>> 2. you need to use a current pandoc. Citing Professor Ripley: "The
>> platforms failing are using pandoc 3.1.6 or (newly updated, M1mac)
>> 3.1.6.1"
> 
> I'll be sure to try upgrading before resubmitting.
> 
> Thanks again for your help!
> 
>> Best,
>> Uwe Ligges
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Simon Urbanek



> On Aug 26, 2023, at 11:01 AM, Dirk Eddelbuettel  wrote:
> 
> 
> On 25 August 2023 at 18:45, Duncan Murdoch wrote:
> | The real problem is that there are two stubborn groups opposing each 
> | other:  the data.table developers and the CRAN maintainers.  The former 
> | think users should by default dedicate their whole machine to 
> | data.table.  The latter think users should opt in to do that.
> 
> No, it feels more like it is CRAN versus the rest of the world.
> 


In reality it's more people running R on their laptops vs the rest of the 
world. Although people with laptops are the vast majority, they also are the 
least impacted by the decision going either way. I think Jeff summed up the 
core reasoning pretty well. Harm is done by excessive use, not the other way 
around.

That said, I think this thread is really missing the key point: there is no 
central mechanism that would govern the use of CPU resources. OMP_THREAD_LIMIT 
is just one of many ways and even that is vastly insufficient for reasons 
discussed (e.g., recursive use of processes). It is not CRAN's responsibility to 
figure out for each package what it needs to behave sanely - it has no way of 
knowing what type of parallelism is used, under which circumstances and how to 
control it. Only the package author knows that (hopefully), which is why it's 
on them. So instead of complaining here better use of time would be to look at 
what's being used in packages and come up with a unified approach to monitoring 
core usage and a mechanism by which the packages could self-govern to respect 
the desired limits. If there was one canonical place, it would be also easy for 
users to opt in/out as they desire - and I'd be happy to help if any components 
of it need to be in core R.



> Take but one example, and as I may have mentioned elsewhere, my day job 
> consists in providing software so that (to take one recent example) 
> bioinformatics specialists can slice huge amounts of genomics data.  When that 
> happens on a dedicated (expensive) hardware with dozens of cores, it would be 
> wasteful to have an unconditional default of two threads. It would be the end 
> of R among serious people, no more, no less. Can you imagine how the internet 
> headlines would go: "R defaults to two threads". 
> 

If you run on such a machine then you or your admin certainly know how to set 
the desired limits. From experience the problem is exactly the opposite - it's 
far more common for users to not know how to not overload such a machine. As 
for internet headlines, they will always be saying blatantly false things like 
"R is not for large data" even though we have been using it to analyze 
terabytes of data per minute ...

Cheers,
Simon



> And it is not just data.table as even in the long thread over in its repo we 
> have people chiming in using OpenMP in their code (as data.table does but 
> which needs a different setter than the data.table thread count).
> 
> It is the CRAN servers which (rightly !!) want to impose constraints for when 
> packages are tested.  Nobody objects to that.
> 
> But some of us wonder if settings these defaults for all R user, all the 
> time, unconditional is really the right thing to do.  Anyway, Uwe told me he 
> will take it to an internal discussion, so let's hope sanity prevails.
> 



Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-26 Thread Simon Urbanek
Tatsuya,

What you do is contact CRAN. I don't think anyone here can answer your 
question, only CRAN can, so ask there.

Generally, packages with sufficiently many Rust dependencies have to be handled 
manually as they break the size limit, so auto-rejections are normal. Archival 
is unusual, but it may have fallen through the cracks - the way to find out 
is to ask.

One related issue with respect to CRAN policies that I don't see a good 
solution for is that inst/AUTHORS is patently unhelpful, because most of them 
say "foo (version ..): foo authors" with no contact, or real names or any 
links. That seems to be a problem stemming from the Rust community as there 
doesn't seem to be any accountability with respect to ownership and 
attribution. I don't know if it's because it's assumed that GitHub history is 
the canonical source with the provenance, but that gets lost when pulled into 
the package.

Cheers,
Simon

PS: Your README says "(Rust 1.65 or later)", but the version condition is 
missing from SystemRequirements.


> On Aug 26, 2023, at 2:46 PM, SHIMA Tatsuya  wrote:
> 
> Hi,
> 
> I noticed that my submitted package `prqlr` 0.5.0 was archived from CRAN on 
> 2023-08-19.
> 
> 
> I submitted prqlr 0.5.0 on 2023-08-13. I believe I have since only received 
> word from CRAN that it passed the automated release process. 
> 
> So I was very surprised to find out after I returned from my trip that this 
> was archived.
> 
> The CRAN page says "Archived on 2023-08-19 for policy violation. " but I 
> don't know what exactly was the problem.
> I have no idea what more to fix as I believe I have solved all the problems 
> when I submitted 0.5.0.
> 
> Is there any way to know what exactly was the problem?
> (I thought I sent an e-mail to CRAN 5 days ago but have not yet received an 
> answer, so I decided to ask my question on this mailing list, thinking that 
> there is a possibility that there will be no answer to my e-mail, although I 
> may have to wait a few weeks for an answer. My apologies if this idea is 
> incorrect.)
> 
> Best,
> Tatsuya
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-26 Thread Simon Urbanek
Yutani,


> On Aug 27, 2023, at 2:19 PM, Hiroaki Yutani  wrote:
> 
> Simon,
> 
> > it's assumed that GitHub history is the canonical source with the 
> > provenance, but that gets lost when pulled into the package.
> 
> No, not GitHub. You can usually find the ownership on crates.io. So, if you 
> want a target to blame, it's probably just a problem of the script to 
> auto-generate inst/AUTHORS in this specific case. But, clearly, Rust's 
> ecosystem works soundly under the existence of crates.io, so I think this is 
> the same kind of pain which you would feel if you use R without CRAN.
> 

Can you elaborate? I have not found anything that would have a list of authors 
in the sources. I fully agree that I know nothing about it, but even if you use 
R without CRAN, each package contains that information in the DESCRIPTION file 
since it's so crucial. So are you saying you have to use crates.io and do some 
extra step during the (misnamed) "vendor" step? (I didn't see the submitted 
tarball of prqlr, and its release on GitHub is not the actual package, so I 
can't check - thus I'm just trying to reverse-engineer what happens by looking 
at the dependencies, which leads to GitHub).


> Sorry for nitpicking.
> 

Sure, good to get the facts straight.

Cheers,
Simon



> Best,
> Yutani
> 
> On Sun, Aug 27, 2023 at 6:57, Simon Urbanek wrote:
> Tatsuya,
> 
> What you do is contact CRAN. I don't think anyone here can answer your 
> question, only CRAN can, so ask there.
> 
> Generally, packages with sufficiently many Rust dependencies have to be 
> handled manually as they break the size limit, so auto-rejections are normal. 
> Archival is unusual, but it may have fallen through the cracks - but the way 
> to find out is to ask.
> 
> One related issue with respect to CRAN policies that I don't see a good 
> solution for is that inst/AUTHORS is patently unhelpful, because most of them 
> say "foo (version ..): foo authors" with no contact, or real names or any 
> links. That seems to be a problem stemming from the Rust community as there 
> doesn't seem to be any accountability with respect to ownership and 
> attribution. I don't know if it's because it's assumed that GitHub history is 
> the canonical source with the provenance, but that gets lost when pulled into 
> the package.
> 
> Cheers,
> Simon
> 
> PS: Your README says "(Rust 1.65 or later)", but the version condition is 
> missing from SystemRequirements.
> 
> 
> > On Aug 26, 2023, at 2:46 PM, SHIMA Tatsuya wrote:
> > 
> > Hi,
> > 
> > I noticed that my submitted package `prqlr` 0.5.0 was archived from CRAN on 
> > 2023-08-19.
> > <https://CRAN.R-project.org/package=prqlr>
> > 
> > I submitted prqlr 0.5.0 on 2023-08-13. I believe I have since only received 
> > word from CRAN that it passed the automated release process. 
> > <https://github.com/eitsupi/prqlr/pull/161>
> > So I was very surprised to find out after I returned from my trip that this 
> > was archived.
> > 
> > The CRAN page says "Archived on 2023-08-19 for policy violation. " but I 
> > don't know what exactly was the problem.
> > I have no idea what more to fix as I believe I have solved all the problems 
> > when I submitted 0.5.0.
> > 
> > Is there any way to know what exactly was the problem?
> > (I thought I sent an e-mail to CRAN 5 days ago but have not yet received an 
> > answer, so I decided to ask my question on this mailing list, thinking that 
> > there is a possibility that there will be no answer to my e-mail, although 
> > I may have to wait a few weeks for an answer. My apologies if this idea is 
> > incorrect.)
> > 
> > Best,
> > Tatsuya
> > 
> > __
> > R-package-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-package-devel
> > 
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel


[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] published packages not showing updated macOS results

2023-09-13 Thread Simon Urbanek
Aron,

one package managed to spawn a separate process that was blocking the build 
process (long story) and I was on the other side of the world. It should be 
fixed now, but it may take up to a day before the backlog is processed. In the 
future for faster response, please contact me directly - see "CRAN Binary 
Package Maintenance" on the CRAN Team webpage.

Cheers,
Simon


> On Sep 13, 2023, at 2:10 AM, Aron Atkins  wrote:
> 
> Hi.
> 
> It looks like macOS package publishing may have stalled. One of my
> packages, rsconnect 1.1.0, arrived on CRAN about a week ago. It is built
> almost everywhere, but r-release-macos* and r-release-macos-arm64 are still
> showing results from the previous release.
> 
> Recent releases did see check failures on the r-release-macos-x86_64 host
> (because its Pandoc installation was older than supported by one of our
> dependencies). The rsconnect 1.1.0 release should address these failures,
> but I am still waiting to see builds flow through.
> 
> https://cran.r-project.org/web/packages/rsconnect/index.html
> https://cran.r-project.org/web/checks/check_results_rsconnect.html
> 
> Is someone able to look into this or otherwise offer advice?
> 
> Thanks,
> Aron
> -- 
> email: aron.atk...@gmail.com
> home: http://gweep.net/~aron/
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R_orderVector1 - algo: radix, shell, or another?

2023-09-24 Thread Simon Urbanek
I think the logic Jeff had in mind is that R's order() uses the C function 
do_order() for method="shell", and since do_order() uses orderVector1(), by 
induction it is the shell-sort implementation.

order() itself uses whatever you specify in method=.
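A small sketch of the dispatch described above, as seen from the R side 
(nothing here is package-specific):

```r
x <- c(2.5, 1.0, 2.5, 0.5)

# order() lets the caller pick the algorithm explicitly; the default "auto"
# chooses radix for short atomic vectors
o_shell <- order(x, method = "shell")
o_radix <- order(x, method = "radix")

# both produce a valid (and, since both are stable, identical) permutation
stopifnot(identical(x[o_shell], sort(x)),
          identical(x[o_radix], sort(x)))
```

For C code, the question remains which implementation R_orderVector1 follows; 
the point above is only that method= controls the R-level entry point.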

Cheers,
Simon


> On Sep 25, 2023, at 7:10 AM, Jan Gorecki  wrote:
> 
> Hi Jeff,
> 
> Yes I did. My question is about R_orderVector1 which is part of public R C
> api.
> Should I notice something relevant in the source of R's order?
> 
> Best
> Jan
> 
> On Sun, Sep 24, 2023, 17:27 Jeff Newmiller  wrote:
> 
>> Have you read the output of
>> 
>> order
>> 
>> entered at the R console?
>> 
>> 
>> On September 24, 2023 1:38:41 AM PDT, Jan Gorecki 
>> wrote:
>>> Dear pkg developers,
>>> 
>>> Are there any ways to check which sorting algorithm is being used when
>>> calling `order` function? Documentation at
>>> https://stat.ethz.ch/R-manual/R-devel/library/base/html/sort.html
>>> says it is radix for length < 2^31
>>> 
>>> On the other hand, I am using R_orderVector1, passing in double float
>>> smaller than 2^31. Short description of it states
>>> "Fast version of 1-argument case of R_orderVector".
>>> Should I expect R_orderVector1 follow the same algo as R's order()? If so
>>> it should be radix as well.
>>> 
>>> 
>> https://github.com/wch/r-source/blob/ed51d34ec195b89462a8531b9ef30b7b72e47204/src/main/sort.c#L1133
>>> 
>>> If there is no way to check sorting algo, could anyone describe which one
>>> R_orderVector1 uses, and if there is easy API to use different ones from
>> C?
>>> 
>>> Best Regards,
>>> Jan Gorecki
>>> 
>>>  [[alternative HTML version deleted]]
>>> 
>>> __
>>> R-package-devel@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>> 
>> --
>> Sent from my phone. Please excuse my brevity.
>> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Question about Clang 17 Error

2023-09-27 Thread Simon Urbanek
It looks like a C++ run-time mismatch between what cmake is using to build the 
static library and what is used by R. Unfortunately, cmake hides the actual 
compiler calls so it's hard to tell the difference, but that setup relies on 
the correct sequence of library paths.

The rhub manually forces -stdlib=libc++ to all its CXX flags
https://github.com/r-hub/rhub-linux-builders/blob/master/fedora-clang-devel/Makevars
so it is quite different from the gannet tests-clang-trunk setup (also note the 
different library paths), but that's not something you can do universally in 
the package, because it strongly depends on the toolchain setup.

Cheers,
Simon


> On 28/09/2023, at 9:37 AM, Reed A. Cartwright  wrote:
> 
> I was unable to reproduce the error on the rhub's clang17 docker image.
> 
> I notice that the linking command is slightly different between systems.
> And this suggests that I need to find some way to get CRAN to pass -stdlib
> flag at the linking stage.
> 
> CRAN:
> /usr/local/clang17/bin/clang++ -std=gnu++17 -shared
> -L/usr/local/clang/lib64 -L/usr/local/clang17/lib -L/usr/local/gcc13/lib64
> -L/usr/local/lib64 -o rbedrock.so actors.o bedrock_leveldb.o dummy.o init.o
> key_conv.o nbt.o random.o subchunk.o support.o -L./leveldb-mcpe/build
> -pthread -lleveldb -lz
> 
> RHUB:
> clang++-17 -stdlib=libc++ -std=gnu++14 -shared -L/opt/R/devel/lib/R/lib
> -L/usr/local/lib -o rbedrock.so actors.o bedrock_leveldb.o dummy.o init.o
> key_conv.o nbt.o random.o subchunk.o support.o -L./leveldb-mcpe/build
> -pthread -lleveldb -lz -L/opt/R/devel/lib/R/lib -lR
> 
> On Wed, Sep 27, 2023 at 11:36 AM Gábor Csárdi 
> wrote:
> 
>> You might be able to reproduce it with the clang17 container here:
>> 
>> https://r-hub.github.io/containers/
>> You can either run it directly or with the rhub2 package:
>> 
>> https://github.com/r-hub/rhub2#readme
>> 
>> Gabor
>> 
>> On Wed, Sep 27, 2023 at 8:29 PM Reed A. Cartwright
>>  wrote:
>>> 
>>> My package, RBedrock, is now throwing an error when compiled against
>>> Clang17. The error log is here:
>>> 
>>> 
>>> https://www.stats.ox.ac.uk/pub/bdr/clang17/rbedrock.log
>>> 
>>> The important part is
>>> """
>>> Error: package or namespace load failed for ‘rbedrock’ in dyn.load(file,
>>> DLLpath = DLLpath, ...):
>>> unable to load shared object
>>> 
>> '/data/gannet/ripley/R/packages/tests-clang-trunk/rbedrock.Rcheck/00LOCK-rbedrock/00new/rbedrock/libs/rbedrock.so':
>>> 
>>> 
>> /data/gannet/ripley/R/packages/tests-clang-trunk/rbedrock.Rcheck/00LOCK-rbedrock/00new/rbedrock/libs/rbedrock.so:
>>> undefined symbol: _ZNSt3__122__libcpp_verbose_abortEPKcz
>>> Error: loading failed
>>> """
>>> 
>>> From what I can gather through googling, this error can be caused by
>> using
>>> the C linker when one of the dependent libraries is a C++ library.
>>> 
>>> I cannot tell if this is an issue with my package (likely) or CRAN's
>>> clang17 setup (less likely).
>>> 
>>> Background about the package: rbedrock is written in C but links against
>> a
>>> C++ library (Mojang's leveldb fork)  via the library's C-API functions. I
>>> use a dummy .cpp file in the source directory to trigger R into using the
>>> C++ linker. That does still seem to be happening according to the log.
>>> 
>>> Has anyone seen this before and know where I should start looking to fix
>> it?
>>> 
>>> Thanks.
>>> 
>>> --
>>> Reed A. Cartwright, PhD
>>> Associate Professor of Genomics, Evolution, and Bioinformatics
>>> School of Life Sciences and The Biodesign Institute
>>> Arizona State University
>>> ==
>>> Address: The Biodesign Institute, PO Box 876401, Tempe, AZ 85287-6401 USA
>>> Packages: The Biodesign Institute, 1001 S. McAllister Ave, Tempe, AZ
>>> 85287-6401 USA
>>> Office: Biodesign B-220C, 1-480-965-9949
>>> Website:
>>> http://cartwrig.ht/
>>> 
>>>[[alternative HTML version deleted]]
>>> 
>>> __
>>> R-package-devel@r-project.org mailing list
>>> 
>>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo

Re: [R-pkg-devel] CXX_VISIBILITY on macOS

2023-10-08 Thread Simon Urbanek
Matthias,

this has nothing to do with R, but rather your code. You have the wrong order 
of headers: the SWI headers mess up visibility macros, so you have to include 
them *after* Rcpp.h.

Cheers,
Simon


> On 9/10/2023, at 8:41 AM, Matthias Gondan  wrote:
> 
> Dear developers and CRAN people,
> 
> I get some linker warnings on the macOS build server,
> 
> ld: warning: direct access in function '…' from file '…' to global weak 
> symbol '…' from file '…' means the weak symbol cannot be overridden at 
> runtime. This was likely caused by different translation units being compiled 
> with different visibility settings.
> 
> Writing R Extensions (Section 6.16) says that visibility attributes are not 
> supported on macOS nor Windows. If I add $(CXX_VISIBILITY) to PKG_CXXFLAGS, 
> the warnings are still there, and I can see from the compiler log that the 
> flags do not have any effect on macOS. However, if I add -fvisibility=hidden 
> to PKG_CXXFLAGS, the warnings disappear (but I get a reminder from R CMD 
> check that -fvisibility=hidden is not portable). I am wondering if 
> $(CXX_VISIBILITY) could be supported on macOS.
> 
> Best wishes,
> 
> Matthias
> 
> This is the build log of the package: 
> https://www.r-project.org/nosvn/R.check/r-release-macos-arm64/rswipl-00install.html
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] What happened to mlr3proba on CRAN?

2023-10-08 Thread Simon Urbanek
Franz,

it means that the author(s) have abandoned the package: as the note says it was 
failing checks and the authors have not fixed the problems so it has been 
removed from CRAN (more than a year ago).

Cheers,
Simon



> On 9/10/2023, at 10:28 AM, Dr. Franz Király  wrote:
> 
> Dear all,
> 
> can someone explain to me what exactly happened to mlr3proba on CRAN?
> https://cran.r-project.org/web/packages/mlr3proba/index.html
> 
> Thanks
> Franz
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Question about Clang 17 Error

2023-10-11 Thread Simon Urbanek
Reed,

please contact CRAN - this list can only help with general developer's 
questions, not specific issues with a particular CRAN setup - only the 
corresponding member of CRAN running the setup can help. I don't see anything 
obvious - we can see that it's a mismatch of run-times between the cmake build 
and the R linking, but from the package alone it's not clear to me why.

Cheers,
Simon


> On Oct 12, 2023, at 3:51 PM, Reed A. Cartwright  
> wrote:
> 
> Update: I submitted a new version of the package, but it did not fix the 
> issue. The package has now been archived and I do not have access to the 
> error log output anymore from r-devel-linux-x86_64-fedora-clang.
> 
> I did reproduce CRAN's configuration in a VM using the information provided 
> by CRAN for r-devel-linux-x86_64-fedora-clang. I still cannot reproduce the 
> error and at this point I believe that there is a chance that CRAN's machine 
> is misconfigured.
> 
> The specific error happens after rbedrock has been compiled and linked 
> successfully. The specific error is that the symbol 
> _ZNSt3__122__libcpp_verbose_abortEPKcz cannot be found when rbedrock.so is 
> loaded.This symbol was introduced into libc++ in Clang 15.0. What I believe 
> to be happening to cause the error is that Clang++ 17 is adding a reference 
> to this symbol when compiling and linking rbedrock.so but the dynamic linker 
> is loading an older version of libc++.so when trying to load rbedrock.so and 
> the symbol is not found.
> 
> If this is the cause, then I think that the CRAN machine needs to configure 
> the dynamic linker to use the Clang++ 17 libc++.so, or add the proper command 
> line options to R's config variables.
> 
> It's possible that the CRAN's r-devel-linux-x86_64-fedora-clang machine is 
> fine and I've missed something, and I would be happy if someone could help me 
> figure out what it is.
> 
> Also, a new issue cropped up when 0.3.1 was tested on the 
> r-oldrel-macos-x86_64 machine. /usr/bin/ar seems to have failed to produce an 
> archive. The other Mac versions did fine, so I'm not sure if this is a random 
> error or something related to my package. The error log is here: 
> https://www.r-project.org/nosvn/R.check/r-oldrel-macos-x86_64/rbedrock-00install.html
> 
> 
> If anyone can help me resolve this, I'd appreciate it.
> 
> 
> On Wed, Sep 27, 2023 at 2:54 PM Reed A. Cartwright wrote:
> Is there any way to submit packages directly to the CRAN's clang17 setup? I 
> can enable verbose output for CMake and compare the output, but I'd rather 
> not clog up the CRAN incoming queue just to debug a linker error?
> 
> On Wed, Sep 27, 2023 at 2:43 PM Simon Urbanek wrote:
> It looks like a C++ run-time mismatch between what cmake is using to build 
> the static library and what is used by R. Unfortunately, cmake hides the 
> actual compiler calls so it's hard to tell the difference, but that setup 
> relies on the correct sequence of library paths.
> 
> The rhub manually forces -stdlib=libc++ to all its CXX flags
> https://github.com/r-hub/rhub-linux-builders/blob/master/fedora-clang-devel/Makevars
>  
> so it is quite different from the gannet tests-clang-trunk setup (also note 
> the different library paths), but that's not something you can do universally 
> in the package, because it strongly depends on the toolchain setup.
> 
> Cheers,
> Simon
> 
> 
> > On 28/09/2023, at 9:37 AM, Reed A. Cartwright wrote:
> > 
> > I was unable to reproduce the error on the rhub's clang17 docker image.
> > 
> > I notice that the linking command is slightly different between systems.
> > And this suggests that I need to find some way to get CRAN to pass -stdlib
> > flag at the linking stage.
> > 
> > CRAN:
> > /usr/local/clang17/bin/clang++ -std=gnu++17 -shared
> > -L/usr/local/clang/lib64 -L/usr/local/clang17/lib -L/usr/local/gcc13/lib64
> > -L/usr/local/lib64 -o rbedrock.so actors.o bedrock_leveldb.o dummy.o init.o
> > key_conv.o nbt.o random.o subchunk.o support.o -L./leveldb-mcpe/build
>

Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Simon Urbanek
John,

the short answer is it won't work (it defeats the purpose of vignettes).

However, this sounds like a purely hypothetical question - CRAN policies allow 
long-running vignettes if they declared.

Cheers,
Simon


> On 18/10/2023, at 3:02 AM, John Fox  wrote:
> 
> Hello Dirk,
> 
> Thank you (and Kevin and John) for addressing my questions.
> 
> No one directly answered my first question, however, which was whether the 
> approach that I suggested would work. I guess that the implication is that it 
> won't, but it would be nice to confirm that before I try something else, 
> specifically using R.rsp.
> 
> Best,
> John
> 
> On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
>> Caution: External email.
>> On 16 October 2023 at 10:42, Kevin R Coombes wrote:
>> | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
>> | package.
>> For completeness, that approach also works directly with Sweave. Described in
>> a blog post by Mark van der Loo in 2019, and used in a number of packages
>> including a few of mine.
>> That said, I also used the approach described by John Harrold and cached
>> results myself.
>> Dirk
>> --
>> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
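For reference, the R.rsp "as.is" route discussed in this thread boils down to 
shipping a prebuilt PDF next to a tiny stub file - a sketch with illustrative 
file names (DESCRIPTION additionally needs Suggests: R.rsp and 
VignetteBuilder: R.rsp):

```r
# Create the stub telling R.rsp to ship vignettes/longrun.pdf as-is;
# the prebuilt longrun.pdf is committed alongside it.
dir.create("vignettes", showWarnings = FALSE)
writeLines(c(
  "%\\VignetteIndexEntry{Long-running analysis (prebuilt)}",
  "%\\VignetteEngine{R.rsp::asis}"
), "vignettes/longrun.pdf.asis")
```

R CMD build then copies the PDF into the package without re-running any code.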

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Simon Urbanek
Dirk,

I think you misread the email - John was was asking specifically about his 
approach to use REBUILD_CV_VIGNETTES without any caching since that was the 
original question which no one answered in the thread - and that was what I was 
answering. The alternative approaches were already discussed to death so I 
didn't comment on those.

Cheers,
Simon



> On 18/10/2023, at 11:03 AM, Dirk Eddelbuettel  wrote:
> 
> 
> On 18 October 2023 at 08:51, Simon Urbanek wrote:
> | John,
> | 
> | the short answer is it won't work (it defeats the purpose of vignettes).
> 
> Not exactly. Everything is under our (i.e. package author) control, and when
> we want to replace 'computed' values with cached values we can.
> 
> All this is somewhat of a charade. "Of course" we want vignettes to run
> tests. But then we don't want to fall over random missing .sty files or fonts
> (macOS machines have been less forgiving than others), not to mention compile
> time.
> 
> So for simplicity I often pre-make pdf vignettes that get included in other
> latex code as source. Works great, never fails, CRAN never complained --
> which is somewhat contrary to your statement.
> 
> It is effectively the same with tests. We all want maximum test surfaces. But
> when tests fail, or when they run too long, or [insert many other reasons
> here] so many packages run tests conditionally.  Such is life.
> 
> Dirk
> 
> 
> | However, this sounds like a purely hypothetical question - CRAN policies 
> allow long-running vignettes if they declared.
> | 
> | Cheers,
> | Simon
> | 
> | 
> | > On 18/10/2023, at 3:02 AM, John Fox  wrote:
> | > 
> | > Hello Dirk,
> | > 
> | > Thank you (and Kevin and John) for addressing my questions.
> | > 
> | > No one directly answered my first question, however, which was whether 
> the approach that I suggested would work. I guess that the implication is 
> that it won't, but it would be nice to confirm that before I try something 
> else, specifically using R.rsp.
> | > 
> | > Best,
> | > John
> | > 
> | > On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
> | >> Caution: External email.
> | >> On 16 October 2023 at 10:42, Kevin R Coombes wrote:
> | >> | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
> | >> | package.
> | >> For completeness, that approach also works directly with Sweave. 
> Described in
> | >> a blog post by Mark van der Loo in 2019, and used in a number of packages
> | >> including a few of mine.
> | >> That said, I also used the approach described by John Harrold and cached
> | >> results myself.
> | >> Dirk
> | >> --
> | >> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> | >> __
> | >> R-package-devel@r-project.org mailing list
> | >> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> | > 
> | > __
> | > R-package-devel@r-project.org mailing list
> | > https://stat.ethz.ch/mailman/listinfo/r-package-devel
> | > 
> | 
> | __
> | R-package-devel@r-project.org mailing list
> | https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Failing to write config file in linux

2023-10-23 Thread Simon Urbanek
From CRAN policy (which you agreed to when you submitted your package) - note 
in particular the "nor anywhere else on the file system" part and also note 
that it tells you what to do in your case:


Packages should not write in the user’s home filespace (including clipboards), 
nor anywhere else on the file system apart from the R session’s temporary 
directory (or during installation in the location pointed to by TMPDIR: and 
such usage should be cleaned up). Installing into the system’s R installation 
(e.g., scripts to its bin directory) is not allowed.
Limited exceptions may be allowed in interactive sessions if the package 
obtains confirmation from the user.

For R version 4.0 or later (hence a version dependency is required or only 
conditional use is possible), packages may store user-specific data, 
configuration and cache files in their respective user directories obtained 
from tools::R_user_dir(), provided that by default sizes are kept as small as 
possible and the contents are actively managed (including removing outdated 
material).
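A minimal sketch of the R_user_dir() route the policy points to, for the 
package in question (the package name comes from the thread; the file name and 
contents are illustrative):

```r
# per-user, per-package config location sanctioned by the policy (R >= 4.0)
cfg_dir <- tools::R_user_dir("geohabnet", which = "config")
dir.create(cfg_dir, recursive = TRUE, showWarnings = FALSE)

cfg_file <- file.path(cfg_dir, "parameters.yaml")
writeLines("max_distance: 100", cfg_file)   # the user-modified copy lives here

# read the user copy if present, otherwise fall back to the shipped default
params <- if (file.exists(cfg_file)) {
  readLines(cfg_file)
} else {
  readLines(system.file("parameters.yaml", package = "geohabnet"))
}
```

The shipped copy in inst/ stays read-only; only the per-user copy is ever 
written, which satisfies both the policy and read-only library installs.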




> On Oct 23, 2023, at 5:52 AM, Keshav, Krishna  wrote:
> 
> Hi,
> 
> My package is failing on Linux-based systems because of an attempt to write 
> to a location inside the package. One of the core features we would like the 
> user to have is modifying the values in the config file, for which the 
> package has a function that lets the user provide a modified config. In the 
> future, they should be able to provide individual config parameters, which 
> we would also write to the config in the package directory /inst/ so that it 
> can later be fetched. I understand that the policy doesn't allow writing to 
> the home directory. Is there a workaround for this? Or what other potential 
> solutions could we explore?
> 
> Snippet –
> https://github.com/GarrettLab/CroplandConnectivity/blob/923a4a0ca4a0ce8376068ee80986df228ea21d80/geohabnet/R/params.R#L57
> 
> Error –
> ── Failure ('test-parameters.R:38:3'): Test 6: Test to set new 
> parameters.yaml ──
> Expected `set_parameters(new_param_file)` to run without any conditions.
> ℹ Actually got a  with text:
> cannot create file 
> '/home/hornik/tmp/R.check/r-release-gcc/Work/build/Packages/geohabnet/parameters.yaml',
>  reason 'Read-only file system'
> 
> 
> Best Regards,
> Krishna Keshav
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Matrix and Mac OS

2023-10-30 Thread Simon Urbanek
Paul,

can you give us a bit more detail? Which package, which build and where you got 
the errors? Older builds may not have the latest Matrix.

Cheers,
Simon


> On 31/10/2023, at 11:26 AM, Bailey, Paul via R-package-devel 
>  wrote:
> 
> Hi,
> 
> I'm the maintainer for a few packages, one of which is currently failing CRAN 
> checks on Mac OS because Matrix is not available in my required version (the 
> latest). I had to fix a few things due to changes in the latest Matrix 
> package because of how qr works and I thought, given the apparent API change, 
> I should then require the latest version. My error is, "Package required and 
> available but unsuitable version: 'Matrix'"
> 
> When I look at the NEWS in Matrix there is no mention of Mac OS issues, what 
> the latest stable version of Matrix is, nor when a fix is expected. What 
> version of Matrix do the macOS checks test with by default? Where is this 
> documented? I assumed it always tested with the latest version on CRAN, so
> I'm a bit surprised. Or will this be resolved soon and I shouldn't bother 
> CRAN maintainers with a new version of my package?
> 
> Best,
> Paul
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Matrix and Mac OS

2023-10-31 Thread Simon Urbanek
Mikael,

current Matrix fails checks on R-oldrel so that's why only the last working 
version is installed:
https://cran.r-project.org/web/checks/check_results_Matrix.html

Cheers,
Simon
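One way to see which Matrix a given binary setup can actually install is to 
look at the corresponding binary repository rather than the source one - a 
small sketch (the available.packages() step needs network access, so it is 
only indicated):

```r
# where the macOS binary repository for this R version lives
url <- contrib.url("https://cran.r-project.org", type = "mac.binary")
url
#> e.g. "https://cran.r-project.org/bin/macosx/contrib/4.3"

# available.packages(contriburl = url) would then list the Matrix version
# actually offered to the binary check machines (requires network, not run)
```

If the version there lags the source repository, a `Matrix (>= x.y-z)` 
dependency can be unsatisfiable on those check machines.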



> On 1/11/2023, at 4:05 AM, Mikael Jagan  wrote:
> 
> I am guessing that they mean EdSurvey:
> 
>https://cran.r-project.org/web/checks/check_results_EdSurvey.html
> 
> Probably Matrix 1.6-1.1 is not installed on r-oldrel-macos-arm64,
> even though it can be, because it was not released until R 4.3-z.
> 
> AFAIK, methods for 'qr' have not been touched since Matrix 1.6-0, and
> even those changes should have been backwards compatible, modulo handling
> of dimnames (class sparseQR gained a Dimnames slot in 1.6-0).
> 
> So I don't see a clear reason for requiring 1.6-1.1.  Requiring 1.6-0
> might make sense, if somehow EdSurvey depends on how class sparseQR
> preserves dimnames.  But IIRC our rev. dep. checks at that time did not
> reveal problems with EdSurvey.
> 
> Mikael
> 
> On 2023-10-31 7:00 am, r-package-devel-requ...@r-project.org wrote:
>> Paul,
>> can you give us a bit more detail? Which package, which build and where you 
>> got the errors? Older builds may not have the latest Matrix.
>> Cheers,
>> Simon
>>> On 31/10/2023, at 11:26 AM, Bailey, Paul via 
>>> R-package-devel  wrote:
>>> 
>>> Hi,
>>> 
>>> I'm the maintainer for a few packages, one of which is currently failing 
>>> CRAN checks on Mac OS because Matrix is not available in my required 
>>> version (the latest). I had to fix a few things due to changes in the 
>>> latest Matrix package because of how qr works and I thought, given the 
>>> apparent API change, I should then require the latest version. My error is, 
>>> "Package required and available but unsuitable version: 'Matrix'"
>>> 
>>> When I look at the NEWS in Matrix there is no mention of Mac OS issues, 
>>> what the latest stable version of Matrix is, nor when a fix is expected. 
>>> What version do MacOS version test Matrix with by default? Where is this 
>>> documented? I assumed it was always tested with the latest version on CRAN, so 
>>> I'm a bit surprised. Or will this be resolved soon and I shouldn't bother 
>>> CRAN maintainers with a new version of my package?
>>> 
>>> Best,
>>> Paul
>>> 
>>> [[alternative HTML version deleted]]
> 



Re: [R-pkg-devel] Matrix and Mac OS

2023-10-31 Thread Simon Urbanek
Mikael,

in that case I think your requirements are wrong - Matrix says R >= 3.5.0, which 
is apparently incorrect - from what you say it should be 4.2.2? I can 
certainly update to 4.2.3 if necessary.

Cheers,
Simon



> On 1/11/2023, at 9:19 AM, Mikael Jagan  wrote:
> 
> Thanks.  We did see those ERRORs, stemming from use (since Matrix 1.6-0)
> of amsmath commands in Rd files.  These have been supported since R 4.2.2,
> but r-oldrel-macos-* (unlike r-oldrel-windows-*) continues to run R 4.2.0.
> My expectation was that those machines would begin running R >= 4.2.2 well
> before the R 4.4.0 release, but apparently that was wrong.
> 
> I am hesitant to complicate our Rd files with conditions on R versions
> only to support PDF output for R < 4.2.2, but maybe we can consider it
> for the Matrix 1.6-2 release if it is really a barrier for others ...
> 
> Mikael
> 
> On 2023-10-31 3:33 pm, Simon Urbanek wrote:
>> Mikael,
>> current Matrix fails checks on R-oldrel so that's why only the last working 
>> version is installed:
>> https://cran.r-project.org/web/checks/check_results_Matrix.html
>> Cheers,
>> Simon
>>> On 1/11/2023, at 4:05 AM, Mikael Jagan  wrote:
>>> 
>>> I am guessing that they mean EdSurvey:
>>> 
>>>https://cran.r-project.org/web/checks/check_results_EdSurvey.html
>>> 
>>> Probably Matrix 1.6-1.1 is not installed on r-oldrel-macos-arm64,
>>> even though it can be, because it was not released until R 4.3-z.
>>> 
>>> AFAIK, methods for 'qr' have not been touched since Matrix 1.6-0, and
>>> even those changes should have been backwards compatible, modulo handling
>>> of dimnames (class sparseQR gained a Dimnames slot in 1.6-0).
>>> 
>>> So I don't see a clear reason for requiring 1.6-1.1.  Requiring 1.6-0
>>> might make sense, if somehow EdSurvey depends on how class sparseQR
>>> preserves dimnames.  But IIRC our rev. dep. checks at that time did not
>>> reveal problems with EdSurvey.
>>> 
>>> Mikael
>>> 
>>> On 2023-10-31 7:00 am, r-package-devel-requ...@r-project.org wrote:
>>>> Paul,
>>>> can you give us a bit more detail? Which package, which build and where 
>>>> you got the errors? Older builds may not have the latest Matrix.
>>>> Cheers,
>>>> Simon
>>>>> On 31/10/2023, at 11:26 AM, Bailey, Paul via 
>>>>> R-package-devel  wrote:
>>>>> 
>>>>> Hi,
>>>>> 
>>>>> I'm the maintainer for a few packages, one of which is currently failing 
>>>>> CRAN checks on Mac OS because Matrix is not available in my required 
>>>>> version (the latest). I had to fix a few things due to changes in the 
>>>>> latest Matrix package because of how qr works and I thought, given the 
>>>>> apparent API change, I should then require the latest version. My error 
>>>>> is, "Package required and available but unsuitable version: 'Matrix'"
>>>>> 
>>>>> When I look at the NEWS in Matrix there is no mention of Mac OS issues, 
>>>>> what the latest stable version of Matrix is, nor when a fix is expected. 
>>>>> What version do MacOS version test Matrix with by default? Where is this 
>>>>> documented? I assumes it always tested with the latest version on CRAN, 
>>>>> so I'm a bit surprised. Or will this be resolved soon and I shouldn't 
>>>>> bother CRAN maintainers with a new version of my package?
>>>>> 
>>>>> Best,
>>>>> Paul
>>>>> 
>>>>>   [[alternative HTML version deleted]]
>>> 
> 



Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Simon Urbanek
Dirk,

can you clarify where the flags come from? The current CRAN builds 
(big-sur-x86_64 and big-sur-arm64) use

export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
export MACOSX_DEPLOYMENT_TARGET=11.0

so the lowest target is 11.0 and it is no longer forced in the flags (so 
that users can more easily choose their desired targets).

Cheers,
Simon



> On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Hi Simon,
> 
> We use C++20 'inside' our library and C++17 in the API. Part of our C++17 use
> is now expanding to std::filesystem whose availability is dependent on the
> implementation. 
> 
> The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
> that the features we want are only available with 10.15.
> 
> Would we be allowed to use this value of '10.15' on CRAN?
> 
> Thanks as always,  Dirk
> 
> 
> [1] 
> https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Simon Urbanek
Dirk,


> On 17/11/2023, at 10:28 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Simon,
> 
> On 17 November 2023 at 09:35, Simon Urbanek wrote:
> | can you clarify where the flags come from? The current CRAN builds 
> (big-sur-x86_64 and big-sur-arm64) use
> | 
> | export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
> | export MACOSX_DEPLOYMENT_TARGET=11.0
> | 
> | so the lowest target is 11.0 and it is no longer forced in the flags (so 
> that users can more easily choose their desired targets).
> 
> Beautiful, solves our issue.  Was that announced at some point? If so, where?
> 

I don't see what there is to announce, as packages should simply be using the 
flags passed from R, and that process did not change.

That said, the binary target for CRAN has been announced on this list as part 
of the big-sur build announcement:
https://stat.ethz.ch/pipermail/r-sig-mac/2023-April/014731.html


> For reference the R-on-macOS FAQ I consulted still talks about 10.13 at
> https://cran.r-project.org/bin/macosx/RMacOSX-FAQ.html#Installation-of-source-packages
> 
>  CC = clang -mmacosx-version-min=10.13
>  CXX = clang++ -mmacosx-version-min=10.13 -std=gnu++14
>  FC = gfortran -mmacosx-version-min=10.13
>  OBJC = clang -mmacosx-version-min=10.13
>  OBJCXX = clang++ -mmacosx-version-min=10.13
> 
> so someone may want to refresh this. It is what I consulted as relevant info.
> 

It says "Look at file /Library/Frameworks/R.framework/Resources/etc/Makeconf" 
so it is just an example that will vary by build. For example big-sur-arm64 
will give you

$ grep -E '^(CC|CXX|FC|OBJC|OBJCXX) ' 
/Library/Frameworks/R.framework/Resources/etc/Makeconf
CC = clang -arch arm64
CXX = clang++ -arch arm64 -std=gnu++14
FC = /opt/R/arm64/bin/gfortran -mtune=native
OBJC = clang -arch arm64
OBJCXX = clang++ -arch arm64

Again, this is just an example; no one should be entering such flags by hand - 
that's why they are in Makeconf, so packages can use them without worrying about 
the values (see R-exts 1.2: 
https://cran.r-project.org/doc/manuals/R-exts.html#Configure-and-cleanup for 
details).
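For completeness, these settings can also be queried programmatically from R via `R CMD config` (an illustrative sketch; the printed values depend on the local build):

```r
## Ask the installed R for its compiler configuration instead of
## hard-coding flags; Makeconf lives under R.home("etc").
R.home("etc")                                   # directory holding Makeconf
cc <- system2(file.path(R.home("bin"), "R"),
              c("CMD", "config", "CC"), stdout = TRUE)
cat(cc, "\n")                                   # the CC setting for this build
```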

Cheers,
Simon



> Thanks, Dirk
> 
> | 
> | Cheers,
> | Simon
> | 
> | 
> | 
> | > On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
> | > 
> | > 
> | > Hi Simon,
> | > 
> | > We use C++20 'inside' our library and C++17 in the API. Part of our C++17 
> use
> | > is now expanding to std::filesystem whose availability is dependent on the
> | > implementation. 
> | > 
> | > The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
> | > that the features we want are only available with 10.15.
> | > 
> | > Would we be allowed to use this value of '10.15' on CRAN?
> | > 
> | > Thanks as always,  Dirk
> | > 
> | > 
> | > [1] 
> https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
> | > 
> | > -- 
> | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> | > 
> | > __
> | > R-package-devel@r-project.org mailing list
> | > https://stat.ethz.ch/mailman/listinfo/r-package-devel
> | > 
> | 
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 



Re: [R-pkg-devel] Cryptic error on Windows but not Debian

2023-11-18 Thread Simon Urbanek
Adam,


> On Nov 19, 2023, at 9:39 AM, Adam  wrote:
> 
> Dear Ivan,
> 
> Thank you for explaining in such depth. I had not submitted to CRAN before.
> I will look into tools::R_user_dir().
> 
> - May you point me toward the policy that the package should not edit 
> .Renviron?


It is the policy you have agreed to when submitting your package to CRAN:

"CRAN Repository Policy
[...]
The code and examples provided in a package should never do anything which 
might be regarded as malicious or anti-social. The following are illustrative 
examples from past experience.
[...]
 - Packages should not write in the user’s home filespace (including 
clipboards), nor anywhere else on the file system apart from the R session’s 
temporary directory. [...]
  For R version 4.0 or later (hence a version dependency is required or only 
conditional use is possible), packages may store user-specific data, 
configuration and cache files in their respective user directories obtained 
from tools::R_user_dir(), provided that by default sizes are kept as small as 
possible and the contents are actively managed (including removing outdated 
material).
"

Cheers,
Simon



Re: [R-pkg-devel] Cryptic error on Windows but not Debian

2023-11-18 Thread Simon Urbanek
Adam,

no, it is your code in mm_authorize() that violates the CRAN policy, it is not 
about the test. You may not touch user's .Renviron and there is no reason to 
resort to such drastic measures. If you want to cache user's credentials, you 
have to do it in a file located via tools::R_user_dir().
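Something along these lines would satisfy the policy (a minimal sketch; the helper and package names are hypothetical, and a real implementation should also think about file permissions for secrets):

```r
## Hypothetical helpers: cache a credential under the package's own
## config directory (R >= 4.0) instead of editing the user's ~/.Renviron.
save_key <- function(key, package = "megamation") {
  dir <- tools::R_user_dir(package, which = "config")
  dir.create(dir, recursive = TRUE, showWarnings = FALSE)
  writeLines(key, file.path(dir, "key"))
}

read_key <- function(package = "megamation") {
  path <- file.path(tools::R_user_dir(package, which = "config"), "key")
  if (file.exists(path)) readLines(path, n = 1L) else NA_character_
}
```

The package can then read the cached value on load (e.g. in .onLoad) and export it into the session with Sys.setenv(), without ever touching the user's startup files.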

Cheers,
Simon


> On Nov 19, 2023, at 12:07 PM, Adam  wrote:
> 
> Thank you dearly, Simon, for pointing out the policy. May a test do the 
> following?
> 
> 1. Save the user's original value for env var X.
> 2. Write a new value for env var X during a test.
> 3. Write back the original value for env var X at the end of the test.
> 
> An example:
> 
> test_that("mm_authorize() sets credentials", {
>   skip_on_cran()
>   key_original <- mm_key()
>   url_original <- mm_url()
>   withr::defer({
> mm_authorize(
>   key = key_original,
>   url = url_original,
>   overwrite = TRUE
> )
>   })
>   mm_authorize(
> key = "1",
> url = "https://api.megamation.com/uw/joe/ 
> <https://api.megamation.com/uw/joe/>",
> overwrite = TRUE
>   )
>   expect_false(
> endsWith(Sys.getenv("MEGAMATION_URL"), "/")
>   )
> })
> 
> Best,
> Adam
> 
> 
> On Sat, Nov 18, 2023 at 4:52 PM Simon Urbanek  <mailto:simon.urba...@r-project.org>> wrote:
> Adam,
> 
> 
> > On Nov 19, 2023, at 9:39 AM, Adam  > <mailto:asebsadow...@gmail.com>> wrote:
> > 
> > Dear Ivan,
> > 
> > Thank you for explaining in such depth. I had not submitted to CRAN before.
> > I will look into tools::R_user_dir().
> > 
> > - May you point me toward the policy that the package should not edit 
> > .Renviron?
> 
> 
> It is the policy you have agreed to when submitting your package to CRAN:
> 
> "CRAN Repository Policy
> [...]
> The code and examples provided in a package should never do anything which 
> might be regarded as malicious or anti-social. The following are illustrative 
> examples from past experience.
> [...]
>  - Packages should not write in the user’s home filespace (including 
> clipboards), nor anywhere else on the file system apart from the R session’s 
> temporary directory. [...]
>   For R version 4.0 or later (hence a version dependency is required or only 
> conditional use is possible), packages may store user-specific data, 
> configuration and cache files in their respective user directories obtained 
> from tools::R_user_dir(), provided that by default sizes are kept as small as 
> possible and the contents are actively managed (including removing outdated 
> material).
> "
> 
> Cheers,
> Simon
> 


[[alternative HTML version deleted]]



Re: [R-pkg-devel] Cryptic error on Windows but not Debian

2023-11-18 Thread Simon Urbanek
Chris,

this was not a change in interpretation, but rather CRAN's tools have gotten 
better at detecting such bad behavior.

I would like to point out that there is absolutely no reason to mangle user's 
.Renviron since the package can always set any environment variables it needs 
from its (legally managed) configuration files (as described in the policy) on 
load.

Regarding the approach you suggested, personally, I think it is bad - no 
package should be touching personal configuration files - it's entirely 
unnecessary and dangerous (e.g., your code will happily remove the content of 
the file on a write error, losing all the user's values - that's why you never 
write important files directly, but rather create a copy which you atomically 
move into place *after* you know the content is correct).
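For illustration, the copy-then-rename idiom could look like this in R (a sketch only; `atomic_write` is not an existing API, and file.rename is only atomic when source and target are on the same filesystem):

```r
## Write to a temp file in the target's directory, then rename it over
## the target only after the write has fully succeeded.
atomic_write <- function(lines, path) {
  tmp <- tempfile(pattern = ".tmp-", tmpdir = dirname(path))
  on.exit(unlink(tmp), add = TRUE)   # drop the temp copy on any failure
  writeLines(lines, tmp)
  if (!file.rename(tmp, path))
    stop("failed to update ", path)
  invisible(path)
}
```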

Cheers,
Simon



> On Nov 19, 2023, at 12:40 PM, Kenny, Christopher 
>  wrote:
> 
> Rather than using tools::R_user_dir(), you can also ask the user for a path 
> where they would like to save the information to. This allows you to test it 
> with a temporary directory file, but would allow the user to specify their 
> .Renviron file, if they so choose. This acts as a middle ground managing a 
> separate package-specific file and storing it as an environmental variable. 
> This is how I approach it in a handful of packages, including `feltr` (see 
> https://github.com/christopherkenny/feltr/blob/HEAD/R/felt_key.R) and `bskyr` 
> (see https://github.com/christopherkenny/bskyr/blob/main/R/auth_user.R).
> 
> For what it's worth, some of this confusion may come from a relatively recent 
> change in interpretation of the policy mentioned below by Simon (even though 
> the text has long read that way). For years, CRAN allowed packages which had 
> the practice of opting into writing to the default .Renviron file. That old 
> reading comports with the example you point to in qualtRics, where the 
> writing is controlled by the `install` argument, with a default of FALSE. 
> Since sometime in the last year, the interpretation was updated and you are 
> now met with a message from the volunteer which states:
> "Please ensure that your functions do not write by default or in your 
> examples/vignettes/tests in the user's home filespace (including the package 
> directory and getwd()). This is not allowed by CRAN policies.
> Please omit any default path in writing functions. In your 
> examples/vignettes/tests you can write to tempdir()."
> 
> The approach used in `feltr` and other packages to explicitly require a path 
> as an argument appears to be okay with the new reading of the policy. (At 
> least, the CRAN volunteers seem to accept packages which use this approach.)
> 
> Best,
> Chris
> 
> 
> From: R-package-devel  on behalf of 
> Simon Urbanek 
> Sent: Saturday, November 18, 2023 6:14 PM
> To: Adam 
> Cc: r-package-devel@r-project.org 
> Subject: Re: [R-pkg-devel] Cryptic error on Windows but not Debian 
>  
> Adam,
> 
> no, it is your code in mm_authorize() that violates the CRAN policy, it is 
> not about the test. You may not touch user's .Renviron and there is no reason 
> to resort to such drastic measures. If you want to cache user's credentials, 
> you have to do it in a file located via tools::R_user_dir().
> 
> Cheers,
> Simon
> 
> 
>> On Nov 19, 2023, at 12:07 PM, Adam  wrote:
>> 
>> Thank you dearly, Simon, for pointing out the policy. May a test do the 
>> following?
>> 
>> 1. Save the user's original value for env var X.
>> 2. Write a new value for env var X during a test.
>> 3. Write back the original value for env var X at the end of the test.
>> 
>> An example:
>> 
>> test_that("mm_authorize() sets credentials", {
>>skip_on_cran()
>>key_original <- mm_key()
>>url_original <- mm_url()
>>withr::defer({
>>  mm_authorize(
>>key = key_original,
>>url = url_original,
>>    overwrite = TRUE
>>  )
>>})
>>mm_authorize(
>>  key = "1",
>>  url = "https://api.megamation.com/uw/joe/ 
>> <https://api.megamation.com/uw/joe/>",
>>  overwrite = TRUE
>>)
>>expect_false(
>>  endsWith(Sys.getenv("MEGAMATION_URL"), "/")
>>)
>> })
>> 
>> Best,
>> Adam
>> 
>> 
>> On Sat, Nov 18, 2023 at 4:52 PM Simon Urbanek > <mailto:simon.urba...@r-project.org>> wrote:
>> Adam,
>> 
>> 
>>> On Nov 19, 2023, at 9:39 AM, Adam >> <mailto:asebsadow...@gmail.com>> wrote:
>>> 
>>> Dear Ivan,
>>

Re: [R-pkg-devel] macos x86 oldrel backups?

2023-12-05 Thread Simon Urbanek
Jon,

Packages for the high-sierra build are currently not being built due to hardware 
issues. That macOS version has been out of support by Apple for so long (over 6 
years) that it is hard to maintain. Only big-sur builds are supported at this 
point. It is possible that we may be able to restore the old builds, but it is 
not guaranteed. (BTW, the right mailing list for this is R-SIG-Mac.)

Cheers,
Simon



> On 5/12/2023, at 09:52, Jonathan Keane  wrote:
> 
> Thank you to the CRAN maintainers for maintenance and keeping the all
> of the CRAN infrastructure running.
> 
> I'm seeing a long delay in builds on CRAN for r-oldrel-macos-x86_64.
> I'm currently interested in Arrow [1], but I'm seeing many other
> packages with similar missing r-oldrel-macos-x86_64 builds (possibly
> all, I sampled a few packages from [2], but didn't do an exhaustive
> search) for an extended period.
> 
> It appears that this started between 2023-10-21 and 2023-10-22. It
> looks like AMR [3] has a successful build but xlcutter does not [4]
> and all the packages I've checked after 2023-10-22 don't have an
> updated build for r-oldrel-macos-x86_64
> 
> Sorry if this is scheduled maintenance, I tried to find an
> announcement here and on r-project.org but haven't yet found anything
> indicating this.
> 
> [1] - https://cran.r-project.org/web/checks/check_results_arrow.html
> [2] - https://cran.r-project.org/web/packages/available_packages_by_date.html
> [3] - https://cran.r-project.org/web/packages/AMR/index.html
> [4] - https://cran.r-project.org/web/packages/xlcutter/index.html
> 
> -Jon
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Simon Urbanek
As discussed here before, packages should *never* set -mmacosx-version-min or 
similar flags by hand. As documented in R-exts 1.2, packages should retrieve 
compiler flags from R (this includes compiling 3rd-party dependencies). 
Incidentally, older versions of R included -mmacosx-version-min in the CC 
setting (high-sierra builds), which current ones don't, but complying packages 
will automatically use the correct compilers and flags regardless of the R 
version and build. Note that this is important, as R can be built for different 
systems and targets, so the package should not assume anything - just ask R.

The implied question was about the target macOS version for CRAN binaries: it 
is always included in the build name, so high-sierra build was targeting macOS 
10.13 (High Sierra) and big-sur build is targeting macOS 11 (Big Sur). It is 
clearly stated next to the download for each macOS R binary on CRAN: 
https://cran.r-project.org/bin/macosx/ where the current releases target macOS 
11.

Anyone distributing macOS binaries should subscribe to R-SIG-Mac where we 
discuss details such as repository locations etc. Developers writing packages 
just need to pay attention to R-exts and CRAN policies (the latter if they want 
to publish on CRAN). I hope this is now clear enough explanation.

Cheers,
Simon


> On Dec 10, 2023, at 10:07 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Last month, I had asked about the setting '-mmacosx-version-min' here.  The
> setting can be used to specify what macOS version one builds for. It is,
> oddly enough, not mentioned in Writing R Extensions but for both r-release and
> r-devel the R Administration manual states
> 
>   • Current CRAN macOS distributions are targeted at Big Sur so it is
> wise to ensure that the compilers generate code that will run on
> Big Sur or later.  With the recommended compilers we can use
>  CC="clang -mmacosx-version-min=11.0"
>  CXX="clang++ -mmacosx-version-min=11.0"
>  FC="/opt//gfortran/bin/gfortran -mmacosx-version-min=11.0"
> or set the environment variable
>  export MACOSX_DEPLOYMENT_TARGET=11.0
> 
> which is clear enough. (There is also an example in the R Internals manual
> still showing the old (and deprecated ?) value of 10.13.)  It is also stated
> at the top of mac.r-project.org.  But it is still in a somewhat confusing
> contradiction to the matrix of tests machines, described e.g. at
> 
>   https://cran.r-project.org/web/checks/check_flavors.html
> 
> which still has r-oldrel-macos-x86_64 with 10.13.
> 
> I found this confusing, and pressed the CRAN macOS maintainer to clarify but
> apparently did so in an insufficiently convincing manner. (There was a word
> about it being emailed to r-sig-mac which is a list I am not on as I don't
> have a macOS machine.) So in case anybody else wonders, my hope is that the
> above is of help. At my day job, we will now switch to 11.0 to take advantage
> of some more recent C++ features.
> 
> Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] Wrong mailing list: Could the 100 byte path length limit be lifted?

2023-12-12 Thread Simon Urbanek
Justin,

now that you clarified what you are actually talking about, this is a question 
about the CRAN policies, so you should really direct it to the CRAN team as it 
is their decision (R-devel would be appropriate if this was a limitation in R 
itself, and R-package-devel would be appropriate if you wanted help with 
refactoring to adhere to the policy). There are still path limits on various 
platforms (even if they are becoming more rare), so I'd personally question the 
source rather than the policy, but then your email was remarkably devoid of any 
details.

Cheers,
Simon


> On Dec 13, 2023, at 6:03 AM, McGrath, Justin M  wrote:
> 
> When submitting a package to CRAN, it is required that path names be shorter 
> than 100 bytes, with the reason that paths longer than that cannot be made 
> into portable tar files. This error is reported by `R CMD check --as-cran`. 
> Since this pertains only to developing packages, this seemed like the 
> appropriate list, but if you don't think so, I can instead ask on R-devel.
> 
> Best wishes,
> Justin
> 
> 
> From: Martin Maechler 
> Sent: Tuesday, December 12, 2023 10:13 AM
> To: McGrath, Justin M
> Cc: r-package-devel@r-project.org
> Subject: Wrong mailing list: [R-pkg-devel] Could the 100 byte path length 
> limit be lifted?
> 
>> McGrath, Justin M
>>on Tue, 12 Dec 2023 15:03:28 + writes:
> 
>> We include other software in our source code. It has some long paths so a 
>> few of the files end up with paths longer than 100 bytes, and we need to 
>> manually rename them whenever we pull in updates.
>> The 100 byte path limit is from tar v7, and since
>> POSIX.1-1988, there has not been a path length limit. That
>> standard is 35 years old now, so given that there is
>> probably no one using an old version of tar that also
>> wants to use the latest version of R, could the 100 byte
>> limit be lifted? Incidentally, I am a big proponent of
>> wide, long-term support, but it's hard to see that this
>> change would negatively impact anyone.
> 
>> Best wishes,
>> Justin
> 
> Wrong mailing list:
> 
> This is a topic for R-devel, not at all R-package-devel,
> but be more accurate about what you are talking about; only between
> the lines could I read that it is about some variants of using
> 'tar'.
> 
> Best regards,
> Martin
> ---
> 
> Martin Maechler
> ETH Zurich  and  R Core team
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] portability question

2023-12-20 Thread Simon Urbanek
Steven,

no, I'm not aware of any negative effect; in fact, having an index in the 
archive is always a good idea - some linkers require it, some work faster with 
it, and at worst the linker ignores it. And as far as I can tell all current 
system "ar" implementations support the -s flag (even though, technically, it's 
only part of the XSI POSIX extension; POSIX doesn't define ranlib, so ar -s 
is better than using ranlib directly).

Cheers,
Simon


> On 21/12/2023, at 8:10 AM, Steven Scott  wrote:
> 
> The Boom package builds a library against which other packages link.  The
> library is built using the Makevars mechanism using the line
> 
> ${AR} rc $@ $^
> 
> A user has asked me to change 'rc' to 'rcs' so that 'ranlib' will be run on
> the archive.  This is apparently needed for certain flavors of macs.  I'm
> hoping someone on this list can comment on the portability of that change
> and whether it would negatively affect other platforms.  Thank you.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] portability question

2023-12-20 Thread Simon Urbanek
This has nothing to do with Steven's question since he is creating a *static* 
library whereas install_name_tool changes install name ID entry of a *dynamic* 
library. Also the data.table example is effectively a no-op, because changing 
the ID makes no difference as it can't be linked against directly anyway. [In 
general, taking advice from data.table regarding macOS doesn't strike me as 
wise given that the authors can't even get their package to work properly on 
macOS.]

Cheers,
Simon


> On 21/12/2023, at 8:42 AM, Dirk Eddelbuettel  wrote:
> 
> 
> On 20 December 2023 at 11:10, Steven Scott wrote:
> | The Boom package builds a library against which other packages link.  The
> | library is built using the Makevars mechanism using the line
> | 
> | ${AR} rc $@ $^
> | 
> | A user has asked me to change 'rc' to 'rcs' so that 'ranlib' will be run on
> | the archive.  This is apparently needed for certain flavors of macs.  I'm
> | hoping someone on this list can comment on the portability of that change
> | and whether it would negatively affect other platforms.  Thank you.
> 
> Just branch for macOS.  Here is a line I 'borrowed' years ago from data.table
> and still use for packages needing to call install_name_tool on macOS.  You
> could have a simple 'true' branch of the test use 'rcs' and the 'false'
> branch do what you have always done.  Without any portability concerns.
> 
> From https://github.com/Rdatatable/data.table/blob/master/src/Makevars.in#L14
> and indented here for clarity
> 
>if [ "$(OS)" != "Windows_NT" ] && [ `uname -s` = 'Darwin' ]; then \
>   install_name_tool -id data_table$(SHLIB_EXT) 
> data_table$(SHLIB_EXT); \
>fi
> 
> Dirk
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 



Re: [R-pkg-devel] checking CRAN incoming feasibility

2024-01-16 Thread Simon Urbanek
Ralf,

that check always hangs for me (I don't think it likes NZ ;)), so I just use

_R_CHECK_CRAN_INCOMING_REMOTE_=0 R CMD check --as-cran ...

Cheers,
Simon


> On Jan 16, 2024, at 6:49 PM, Rolf Turner  wrote:
> 
> 
> On Tue, 16 Jan 2024 16:24:59 +1100
> Hugh Parsonage  wrote:
> 
>>> Surely the software just has to check
>> that there is web connection to a CRAN mirror.
>> 
>> Nope! The full code is in tools:::.check_package_CRAN_incoming  (the
>> body of which filled up my entire console), but to name a few checks
>> it has to do: check that the name of the package is not the same as
>> any other, including archived packages (which means that it has to
>> download the package metadata), make sure the licence is ok, see if
>> the version number is ok. 10 minutes is quite a lot though. I suspect
>> the initial connection may have been faulty.
> 
> Well, it may not have been 10 minutes, but it was at least 5.  The
> problem is persistent/repeatable.  I don't believe that there is any
> faulty connection.
> 
> Thanks for the insight.
> 
> cheers,
> 
> Rolf Turner
> 
> -- 
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Stats. Dep't. (secretaries) phone:
> +64-9-373-7599 ext. 89622
> Home phone: +64-9-480-4619
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>



Re: [R-pkg-devel] test failure: oldrel

2024-01-16 Thread Simon Urbanek



> On Jan 17, 2024, at 3:46 AM, Josiah Parry  wrote:
> 
> Hey folks! I've received note that a package of mine is failing tests on
> oldrel.
> 
> Check results:
> https://www.r-project.org/nosvn/R.check/r-oldrel-windows-x86_64/arcgisutils-00check.html
> 
> I think I've narrowed it down to the way that I've written the test which
> uses `as.POSIXct(Sys.Date(), tz = "UTC")`.
> 

That's not where it fails - it fails in

today <- Sys.Date()
today_ms <- date_to_ms(today)
as.POSIXct(today_ms / 1000)

which is equivalent to

as.POSIXct(as.numeric(Sys.Date()) * 86400)

and that is only supported since R 4.3.0 - from NEWS:

  as.POSIXct() and as.POSIXlt(.) (without specifying origin) now work.

so you have to add R >= 4.3.0 or use .POSIXct() instead.

I didn't check your other tests so you may have more of the same ...

Cheers,
Simon
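[Editorial sketch of the point above — hedged: the package's `date_to_ms()` helper is paraphrased here as plain arithmetic:]

```r
## Milliseconds since the epoch for today's date, as in the failing test:
today_ms <- as.numeric(Sys.Date()) * 86400 * 1000

## Fails on R < 4.3.0 (numeric input with no 'origin' given):
## as.POSIXct(today_ms / 1000)

## Portable alternatives that also work on older R:
t1 <- .POSIXct(today_ms / 1000, tz = "UTC")
t2 <- as.POSIXct(today_ms / 1000, origin = "1970-01-01", tz = "UTC")
stopifnot(abs(as.numeric(t1) - as.numeric(t2)) < 1e-6)
```

Either variant avoids declaring a dependency on R >= 4.3.0.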


> If I understand the R release changelog correctly, this behavior did not
> exist prior to R 4.3.0.
> 
> as.POSIXlt() now does apply a tz (time zone) argument, as does
>> as.POSIXct(); partly suggested by Roland Fuß on the R-devel mailing list.
> 
> 
> https://cran.r-project.org/doc/manuals/r-release/NEWS.html
> 
> Does this check out? If so, would it be more effective to modify the test to
> use the character method of `as.POSIXct()`?
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CMake on CRAN Systems

2024-01-17 Thread Simon Urbanek
I had a quick look and that package (assuming it's 
https://github.com/stsds/MPCR) does not adhere to any rules from R-exts (hence 
the removal from CRAN I presume) so the failure to detect cmake is the least 
problem. I would strongly recommend reading the  R documentation as cmake is 
just the wrong tool for the job in this case. R already has a fully working 
build system which will compile the package using the correct flags and tools - 
you only need to provide the C++ sources. You cannot generate the package 
shared object with cmake by definition - you must let R build it. [In rare cases 
dependent static libraries are sometimes built with cmake inside the package if 
there is no other option and cmake is used upstream, but those are rare and you 
still have to use R to build the final shared object].

Cheers,
Simon


> On Jan 17, 2024, at 8:54 PM, Ivan Krylov via R-package-devel 
>  wrote:
> 
> Dear Sameh,
> 
> Regarding your question about the MPCR package and the use of CMake
> :
> on a Mac, you have to look for the cmake executable in more than one
> place because it is not guaranteed to be on the $PATH. As described in
> Writing R Extensions
> , the
> following is one way to work around the problem:
> 
> if test -z "$CMAKE"; then CMAKE="`which cmake`"; fi
> if test -z "$CMAKE"; then
> CMAKE=/Applications/CMake.app/Contents/bin/cmake;
> fi
> if test ! -f "$CMAKE"; then echo "no ‘cmake’ command found"; exit 1; fi
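[Editorial sketch of that detection pattern as a self-contained function — hedged: `find_cmake` and the probe argument are names introduced here, and the fallback is the macOS CMake.app path from Writing R Extensions. Note the negated `test ! -f`, so the error branch fires only when no file is found:]

```shell
# Look for cmake: explicit argument, then PATH, then the macOS app bundle.
find_cmake() {
  CMAKE="$1"
  if test -z "$CMAKE"; then CMAKE="`command -v cmake`"; fi
  if test -z "$CMAKE"; then
    CMAKE=/Applications/CMake.app/Contents/bin/cmake
  fi
  if test ! -f "$CMAKE"; then
    echo "no 'cmake' command found" >&2
    return 1
  fi
  echo "$CMAKE"
}
# Probe with a path that certainly exists, standing in for a cmake binary:
find_cmake /bin/sh
```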
> 
> Please don't reply to existing threads when starting a new topic on
> mailing lists. Your message had a mangled link that went to
> urldefense.com instead of cran-archive.r-project.org, letting Amazon
> (who host the website) know about every visit to the link:
> https://stat.ethz.ch/pipermail/r-package-devel/2024q1/010328.html
> 
> -- 
> Best regards,
> Ivan
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R-package-devel Digest, Vol 105, Issue 19

2024-01-24 Thread Simon Urbanek
This is a reminder why one should never build packages directly in their source 
directory since it can only be done once (for packages with native source code) 
- always use

R CMD build --no-build-vignettes foo && R CMD INSTALL foo_*.tar.gz

if you plan to edit files in the source directory and re-use it.

Cheers,
Simon


> On 25/01/2024, at 8:34 AM, Carl Schwarz  wrote:
> 
> Solved...
> 
> The src/ directory also included .o and .so objects from the last build
> of the package that must be "out of date" because once I removed those
> older objects, the Build -> Document and Build -> Check package now work
> fine without crashing... A newer version of the .o and .so objects are now
> built and it now works fine.
> 
> Thanks
> Carl Schwarz
> 
> On Wed, Jan 24, 2024 at 10:57 AM Carl Schwarz 
> wrote:
> 
>> Thanks for your suggestions.  I followed the suggestion in today's message
>> (see results below) which all run without issue.
>> I tried to isolate the problem more
>> 
>> The issue appears to be with load_dll()
>> 
>> When I try
>>> getwd()
>> [1] "/Users/cschwarz/Library/CloudStorage/Dropbox/SPAS-R/SPAS"
>>> load_dll()
>> 
>> It crashes.
>> 
>> 
>> I moved the package outside of CloudStorage to see if that is the issue.
>>> getwd()
>> [1] "/Users/cschwarz/Desktop/SPAS"
>>> load_dll()
>> 
>> It crashes.
>> 
>> 
>> I tried doing a load_dll() where there is NO C++ object, i.e. a random
>> directory and it terminates with a sensible error message
>>> setwd("/Users/cschwarz/Desktop/BikePics")
>>> library(pkgload)
>>> load_dll()
>> Error in `value[[3L]]()`:
>> ! Could not find a root 'DESCRIPTION' file that starts with '^Package' in
>> /Users/cschwarz/Desktop/BikePics.
>> ℹ Are you in your project directory and does your project have a
>> 'DESCRIPTION' file?
>> Run `rlang::last_trace()` to see where the error occurred.
>> 
>> I'm following the suggestions on including TMB code in a package at
>> 
>> https://stackoverflow.com/questions/48627069/guidelines-for-including-tmb-c-code-in-an-r-package
>> and appear to have all the necessary files
>> 
>> I created my own load_dll() function by copying over the code and adding a
>> browser().
>> It appears to run fine until the statement library.dynam2(path.lib) where
>> it cannot find the function library.dynam2
>> 
>> 
>>> my_load_dll()
>> Called from: my_load_dll()
>> Browse[1]> n
>> debug at #4: package <- pkg_name(path)
>> Browse[2]> n
>> debug at #5: env <- ns_env(package)
>> Browse[2]> n
>> debug at #6: nsInfo <- parse_ns_file(path)
>> Browse[2]> n
>> debug at #7: dlls <- list()
>> Browse[2]> n
>> debug at #8: dynLibs <- nsInfo$dynlibs
>> Browse[2]> n
>> debug at #9: nativeRoutines <- list()
>> Browse[2]> n
>> debug at #10: for (i in seq_along(dynLibs)) {
>>lib <- dynLibs[i]
>>dlls[[lib]] <- library.dynam2(path, lib)
>>routines <- assignNativeRoutines(dlls[[lib]], lib, env,
>> nsInfo$nativeRoutines[[lib]])
>>nativeRoutines[[lib]] <- routines
>>if (!is.null(names(nsInfo$dynlibs)) &&
>> nzchar(names(nsInfo$dynlibs)[i]))
>>env[[names(nsInfo$dynlibs)[i]]] <- dlls[[lib]]
>>setNamespaceInfo(env, "DLLs", dlls)
>> }
>> Browse[2]> n
>> debug at #11: lib <- dynLibs[i]
>> Browse[2]> n
>> debug at #12: dlls[[lib]] <- library.dynam2(path, lib)
>> Browse[2]> n
>> Error in library.dynam2(path, lib) :
>>  could not find function "library.dynam2"
>> 
>> I'm unable to find where the library.dynam2() function lies... A google
>> search for library.dynam2 doesn't show anything except for a cryptic
>> comment in
>> https://rdrr.io/cran/pkgload/src/R/load-dll.R
>> which says
>> 
>> ## The code below taken directly from base::loadNamespace
>>  ## 
>> https://github.com/wch/r-source/blob/tags/R-3-3-0/src/library/base/R/namespace.R#L466-L485
>>  ## except for the call to library.dynam2, which is a special version of
>>  ## library.dynam
>> 
>> This is now beyond my pay grade..
>> 
>> Suggestions?
>> 
>> 
>> --
>> 
>> From James Lamb 
>> 
>> Using the shell:
>> 
>> R CMD build .
>> - success with
>> 
>> * checking for file ‘./DESCRIPTION’ ... OK
>> 
>> * preparing ‘SPAS’:
>> 
>> * checking DESCRIPTION meta-information ... OK
>> 
>> * cleaning src
>> 
>> * installing the package to build vignettes
>> 
>> * creating vignettes ... OK
>> 
>> * cleaning src
>> 
>> * checking for LF line-endings in source and make files and shell scripts
>> 
>> * checking for empty or unneeded directories
>> 
>> * building ‘SPAS_2024.1.31.tar.gz’
>> 
>> 
>> R CMD INSTALL --with-keep.source ./SPAS_*.tar.gz
>> - success. Lots of warning from the C compiler but appears to terminate
>> successfully with
>> 
>> 
>> installing to /Users/cschwarz/Rlibs/00LOCK-SPAS/00new/SPAS/libs
>> 
>> ** R
>> 
>> ** inst
>> 
>> ** byte-compile and prepare package for lazy loading
>> 
>> ** help
>> 
>> *** installing help indices
>> 
>> ** building package indices
>> 
>> ** installing vignettes
>> 
>> ** testing if installed package can be loaded from tempor

Re: [R-pkg-devel] Possible malware(?) in a vignette

2024-01-25 Thread Simon Urbanek
Iñaki,

I think you got it backwards in your conclusions: CRAN has not generated that 
PDF file (and Windows machines are not even involved here), it is the contents 
of a contributed package, so CRAN itself is not compromised. Also it is far 
from clear that it is really a malware - in fact it's certainly NOT what the 
website you linked claims as those tags imply trojans disguising ZIPped 
executables as PDF, but the file is an actual valid PDF and not even remotely a 
ZIP file (in fact it is consistent with pdflatex output). I looked at the 
decompressed payload of the PDF and the only binary payload are embedded fonts 
so my guess would be that some byte sequence in the fonts gets detected as 
false-positive trojan, but since there is no detail on the report we can just 
guess. False-positives are a common problem and this would not be the first 
one. Further indication that it's a false-positive is that simply 
re-packaging the streams (i.e. NOT changing the actual PDF contents) makes the 
same file pass the tests as clean.

Also note that there is a bit of a confusion as the currently released version 
(poweRlaw 0.80.0) does not get flagged, so it is only the archived version 
(from 2020).

Cheers,
Simon



> On 26/01/2024, at 12:02 AM, Iñaki Ucar  wrote:
> 
> On Thu, 25 Jan 2024 at 10:13, Colin Gillespie  wrote:
>> 
>> Hi All,
>> 
>> I've had two emails from users in the last 24 hours about malware
>> around one of my vignettes. A snippet from the last user is:
>> 
>> ---
>> I was trying to install a R package that depends on PowerRLaw two
>> weeks ago.  However my virus protection software F secure did not
>> allow me to install it from CRAN, while installation from GitHub
>> worked normally. Virus protection software claimed that
>> d_jss_paper.pdf is compromised. I asked about this from our IT support
>> and they asked it from the company F secure. Now F secure has analysed
>> the file and according them it is malware.
>> 
>> “Upon analyzing, our analysis indicates that the file you submitted is
>> malicious. Hence the verdict will remain
> 
> See 
> https://www.virustotal.com/gui/file/9486d99c1c1f2d1b06f0b6c5d27c54d4f6e39d69a91d7fad845f323b0ab88de9/behavior
> 
> According to the sandboxed analysis, there's something there trying to
> tamper with the Acrobat installation. It tries several Windows paths.
> That's not good.
> 
> The good news is that, if I recreate the vignette from your repo, the
> file is different, different hash, and it's clean.
> 
> The bad news is that... this means that CRAN may be compromised. I
> urge CRAN maintainers to check all the PDF vignettes and scan the
> Windows machines for viruses.
> 
> Best,
> Iñaki
> 
> 
>> 
>> ---
>> 
>> Other information is:
>> 
>> * Package in question:
>> https://cran.r-project.org/web/packages/poweRlaw/index.html
>> * Package hasn't been updated for three years
>> * Vignette in question:
>> https://cran.r-project.org/web/packages/poweRlaw/vignettes/d_jss_paper.pdf
>> 
>> CRAN asked me to fix
>> https://cran.r-project.org/web/checks/check_results_poweRlaw.html a
>> couple of days ago - which I'm in the process of doing.
>> 
>> Any ideas?
>> 
>> Thanks
>> Colin
>> 
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> 
> 
> -- 
> Iñaki Úcar
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Possible malware(?) in a vignette

2024-01-27 Thread Simon Urbanek
Bob,

I was not making assertions, I was only dismissing clearly false claims: CRAN 
did NOT generate the file in question, it is not a ZIP file trojan as indicated 
by the AV flags and content inspection did not reveal any other streams than 
what is usual in pdflatex output. The information about the alleged malware was 
terribly vague and incomplete to put it mildly so if you have any additional 
forensic information that sheds more light on whether this was a malware or 
not, it would be welcome. If it was indeed one, knowing what kind would help to 
see how any other instances could be detected. Please contact the CRAN team if 
you have any such information and we can take it from there.

As you hinted yourself - there is no such thing as absolute safety - as the 
webp exploits have illustrated very clearly a simple image can be malware and 
the only real defense is to keep your software up to date.

Cheers,
Simon



> On Jan 27, 2024, at 9:52 PM, Bob Rudis  wrote:
> 
> The current one on CRAN does get flagged for some low-level Sigma rules b/c 
> of the way a few URLs interact. I don't know if F-Secure is pedantic 
> enough to call that malicious (it probably is, though). The *current* PDF is 
> "fine".
> 
> There is a major problem with the 2020 version. The file at Iñaki's URL matches 
> the PDF that I grabbed from the Wayback Machine for the 2020 PDF from that 
> URL.
> 
> Simon's assertion about this *2020* file is flat out wrong. It's very bad.
> 
> Two VT sandboxes used Adobe Acrobat Reader to open the PDF and the PDF seems 
> to either have had malicious JavaScript or have been crafted sufficiently to cause 
> a buffer overflow in Reader that then let it perform other functions on those 
> sandboxes.
> 
> They are most certainly *not* false positives, and dismissing that outright 
> is not great.
> 
> I'm not going to check every 2020 PDF from CRAN, but this is a big signal to 
> me there was an issue *somewhere* in that time period.
> 
> I do not know what cran.r-project.org resolved to for the Common Crawl at 
> that date (which is where archive.org picked it up to archive for the 2020 
> PDF version). I highly doubt the Common Crawl DNS resolution process was 
> spoofed _just for that PDF URL_, but it may have been for CRAN in general or 
> just "in general" during that crawl period.
> 
> It is also possible some malware hit CRAN during portions of that time period 
> and infected more than one PDF.
> 
> But, outright suggesting there is no issue was not the way to go, here. And, 
> someone should likely at least poke at more 2020 PDFs from CRAN vignette 
> builds (perhaps just the ones built that were JSS articles…it's possible the 
> header image sourced at that time was tampered with during some time window, 
> since image decoding issues have plagued Adobe Reader in buffer overflow land 
> for a long while).
> 
> - boB
> 
> 
> On Thu, Jan 25, 2024 at 9:44 PM Simon Urbanek  
> wrote:
> Iñaki,
> 
> I think you got it backwards in your conclusions: CRAN has not generated that 
> PDF file (and Windows machines are not even involved here), it is the 
> contents of a contributed package, so CRAN itself is not compromised. Also it 
> is far from clear that it is really a malware - in fact it's certainly NOT 
> what the website you linked claims as those tags imply trojans disguising 
> ZIPped executables as PDF, but the file is an actual valid PDF and not even 
> remotely a ZIP file (in fact it is consistent with pdflatex output). I looked 
> at the decompressed payload of the PDF and the only binary payload are 
> embedded fonts so my guess would be that some byte sequence in the fonts gets 
> detected as false-positive trojan, but since there is no detail on the report 
> we can just guess. False-positives are a common problem and this would not be 
> the first one. Further indication that it's a false-positive is that simply 
> re-packaging the streams (i.e. NOT changing the actual PDF contents) makes the 
> same file pass the tests as clean.
> 
> Also note that there is a bit of a confusion as the currently released 
> version (poweRlaw 0.80.0) does not get flagged, so it is only the archived 
> version (from 2020).
> 
> Cheers,
> Simon
> 
> 
> 
> > On 26/01/2024, at 12:02 AM, Iñaki Ucar  wrote:
> > 
> > On Thu, 25 Jan 2024 at 10:13, Colin Gillespie  wrote:
> >> 
> >> Hi All,
> >> 
> >> I've had two emails from users in the last 24 hours about malware
> >> around one of my vignettes. A snippet from the last user is:
> >> 
> >> ---
> >> I was trying to install a R package that depends on PowerRLaw two
> >> weeks ago.

Re: [R-pkg-devel] Possible malware(?) in a vignette

2024-01-27 Thread Simon Urbanek
Iñaki,

> On Jan 27, 2024, at 11:44 PM, Iñaki Ucar  wrote:
> 
> Simon,
> 
> Please re-read my email. I did *not* say that CRAN *generated* that file. I 
> said that CRAN *may* be compromised (some virus may have modified files).
> 


I guess I should have been more clear in my response: the file could not have 
been modified by CRAN, because the package files are checksummed (the hashes 
match); that's how we know this could not have been a virus on the CRAN 
machine.


> I did *not* claim that the report was necessarily 100% accurate. But "that 
> page I linked" was created by a security firm, and it would be wise to 
> further investigate any potential threat reported there, which is what I was 
> suggesting.
> 


I appreciate the report, there was no objection to that. Unfortunately, the 
report has turned out to have virtually no useful information that would make 
it possible for us to investigate. The little information it provided has 
proven to be false (at least as much as could be gleaned from the tags), so 
unless we can get some real security expert to give us more details, there is 
not much more we can do given that the file is no longer distributed. And 
without more detailed information of the threat it's hard to see if there are 
any steps we could take. 

Back to my main original point - as far as CRAN machines are concerned, we did 
check the integrity of the files, machines and tools and found no link there. 
Hence the only path left is to get more details on the particular file to see 
if it is indeed a malware and if so, if it was just some random infection at 
the source or something bigger like Bob hinted at some compromised material 
that may have been circling in the community.

Cheers,
Simon



> I don't think these are "false claims".
> 
> Iñaki
> 
> On Sat, Jan 27, 2024 at 11:19, Simon Urbanek  wrote:
> Bob,
> 
> I was not making assertions, I was only dismissing clearly false claims: CRAN 
> did NOT generate the file in question, it is not a ZIP file trojan as 
> indicated by the AV flags and content inspection did not reveal any other 
> streams than what is usual in pdflatex output. The information about the 
> alleged malware was terribly vague and incomplete to put it mildly so if you 
> have any additional forensic information that sheds more light on whether 
> this was a malware or not, it would be welcome. If it was indeed one, knowing 
> what kind would help to see how any other instances could be detected. Please 
> contact the CRAN team if you have any such information and we can take it 
> from there.
> 
> As you hinted yourself - there is no such thing as absolute safety - as the 
> webp exploits have illustrated very clearly a simple image can be malware and 
> the only real defense is to keep your software up to date. 
> 
> Cheers,
> Simon
> 
> 
> 
> > On Jan 27, 2024, at 9:52 PM, Bob Rudis  wrote:
> > 
> > The current one on CRAN does get flagged for some low-level Sigma rules b/c 
> > of the way a few URLs interact. I don't know if F-Secure is pedantic 
> > enough to call that malicious (it probably is, though). The *current* PDF 
> > is "fine".
> > 
> > There is a major problem with the 2020 version. The file at Iñaki's URL 
> > matches the PDF that I grabbed from the Wayback Machine for the 2020 PDF 
> > from that URL.
> > 
> > Simon's assertion about this *2020* file is flat out wrong. It's very bad.
> > 
> > Two VT sandboxes used Adobe Acrobat Reader to open the PDF and the PDF 
> > seems to either have had malicious JavaScript or have been crafted sufficiently 
> > to cause a buffer overflow in Reader that then let it perform other 
> > functions on those sandboxes.
> > 
> > They are most certainly *not* false positives, and dismissing that outright 
> > is not great.
> > 
> > I'm not going to check every 2020 PDF from CRAN, but this is a big signal 
> > to me there was an issue *somewhere* in that time period.
> > 
> > I do not know what cran.r-project.org resolved 
> > to for the Common Crawl at that date (which is where archive.org 
> > picked it up to archive for the 2020 PDF version). I 
> > highly doubt the Common Crawl DNS resolution process was spoofed _just for 
> > that PDF URL_, but it may have been for CRAN in general or just "in 
> > general" during that crawl period.
> > 
> > It is also possible some malware hit CRAN during portions of that time 
> > period and infected more than one PDF.
> > 
> > 

Re: [R-pkg-devel] Possible malware(?) in a vignette

2024-01-27 Thread Simon Urbanek
First, let's take a step back, because I think there is way too much confusion 
here.

The original report was about the vignette from the poweRlaw package version 
0.70.6. That package contains a vignette file d_jss_paper.pdf with the SHA256 
hash 9486d99c1c1f2d1b06f0b6c5d27c54d4f6e39d69a91d7fad845f323b0ab88de9 (md5 
e0439db551e1d34e9bf8713fca27887b). This is the same file that would be 
available for download from the web view until the new version was published. 
However, I assume we are talking about the same file based on the fact that 
Iñaki's VirusTotal URL has exactly the same hash, i.e., web view and the 
package are identical (I also checked the other hashes just to be really sure). 
That's why I think we're barking up the wrong tree here since this is not about 
cache poisoning, file swaps or anything like that - the file has never been 
modified - it is the same file that has been submitted to CRAN in 2020.

That's why I was saying that this most likely has nothing to do with CRAN at 
all, but rather the question is if that old file has included some malware for 
the last 4 years or if simply the AV software is misclassifying due to a 
false-positive detection. I'm not a security expert, but based on the little 
information available and inspection of the streams I came to the conclusion 
that it's likely a false-positive. The main reason that made me think so was 
that submitting the exact same *identical* PDF payload with just one-byte 
change to the /ID (which is functionally not used by Acrobat) results in the 
file NOT being flagged as malicious by VirusTotal by any of the security 
vendors. That said, I'm not a security expert, so I may be wrong or I'm missing 
something, that's why I was asking for someone with more expertise to actually 
look at the file as opposed to just trusting auto-generated reports that may be 
wrong. But that is not beyond my power.

(Also if it turns out that the file did contain malware, it would be good to 
know what we can do - for example, nowadays we are re-compressing streams 
and/or filtering through GS so one could imagine that it could also be 
effective at removing PDF malware - if it is real.)

More responses inline.


> On Jan 28, 2024, at 1:10 AM, Bob Rudis  wrote:
> 
> Simon: Is there a historical record of the hashes of just the PDFs
> that show up in the CRAN web view?
> 

Not the website, but hashes are recorded in the packages - so you can verify 
that the file has not changed for years (I can directly confirm it has not 
changed as far back as May 2021).
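[Editorial sketch of checking those recorded hashes — hedged: the package name and tarball file name are placeholders, and base packages may not ship an MD5 file while CRAN binary/installed packages do:]

```r
## The MD5 file shipped inside an installed package records per-file
## hashes, so after-the-fact tampering can be detected:
tools::checkMD5sums("poweRlaw")          # TRUE if no installed file was altered

## For a downloaded tarball, the overall hash can be compared directly
## against the value published by CRAN:
tools::md5sum("poweRlaw_0.70.6.tar.gz")
```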


> Ivan: do you know what mirror NOAA used at that time to get that version of
> the package? Or, did they pull it "directly" from cran.r-project.org
> (scare-quotes only b/c DNS spoofing is and has been a pretty solid attack
> vector)?
> 
> I've asked the infosec community if anyone has VT Enterprise to do a
> historical search on any PDFs that come directly from cran.r-project.org (I
> don't have VT Enterprise). It is possible there are other PDFs from that
> timeframe with similar issues (again, not saying CRAN had any issues; this
> could still be crawler cache poisoning).
> 
> I don't know if any university folks have grad student labor to harness,
> but having a few of them do some archive.org searches for other PDFs in
> that timeframe, and note the source of the archive (likely Common Crawl) if
> there are other real issues, that'd be a solid path forward for triage.
> 
> The fact that the current PDF on CRAN — which uses some of the same
> 7-year-old PDF & JPEG images from —
> https://github.com/csgillespie/poweRlaw/tree/main/vignettes — is not being
> flagged, means it's likely not an issue with Colin's sources.
> 
> Simon: it might be a good idea for all *.r-project.org sites to set up CAA
> records (
> https://en.wikipedia.org/wiki/DNS_Certification_Authority_Authorization)
> since that could help prevent adjacent TLS spoofing.
> 
> Also having something running — https://github.com/SSLMate/certspotter —
> can let y'all know if certs are created for *.r-project.org domains. That
> won't help for well-resourced attacks, but it does add some layers that may
> give a heads-up for any mid-grade spoofing attacks.
> 


All well meant, but remember that CRAN is mirrored worldwide, we have control 
pretty much only over the WU master. That said, we can have a look, but DNS 
changes are not as easy as you would think.

Cheers,
Simon

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Clarifying CRAN's external libraries policy

2024-01-29 Thread Simon Urbanek
Neal,

generally, binaries are not allowed since CRAN cannot check the provenance so 
it's not worth the risk, and it's close to impossible to maintain them over 
time across different systems, toolchains and architectures as they evolve. 
Historically, some packages were allowed to provide binaries (e.g., back when the 
Windows toolchain was not as complete and there was only the Win32 target, it was 
more common to supply a Windows binary) and CRAN was more lenient, but it 
should be avoided nowadays as it was simply too fragile.

As Andrew pointed out in special circumstances you can use external 
hash-checked *source* tar balls, but generally you should provide sources in 
the package.

I do not see any e-mail from you to c...@r-project.org about this, so please 
make sure you are using the correct e-mail if you intend to plead your case.

Cheers,
Simon
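[Editorial sketch of the fixed-version, hash-checked source download mentioned above — hedged: the file name is a stand-in, and in a real package `configure` script the expected hash would be a hard-coded constant for a pinned upstream release rather than computed on the spot:]

```shell
# Simulate the verification step on a local file; a real configure
# script would first download libfoo-1.2.3.tar.gz from a fixed-version
# upstream URL, then check it against the hard-coded published hash.
printf 'pretend upstream sources\n' > libfoo-1.2.3.tar.gz
EXPECTED=$(sha256sum libfoo-1.2.3.tar.gz | cut -d' ' -f1)
# Two spaces between hash and file name, as sha256sum -c expects:
echo "$EXPECTED  libfoo-1.2.3.tar.gz" | sha256sum -c - || exit 1
```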



> On Jan 30, 2024, at 3:11 AM, Neal Richardson  
> wrote:
> 
> Hi,
> CRAN's policy on using external C/C++/Fortran/other libraries says:
> 
> "Where a package wishes to make use of a library not written solely for the
> package, the package installation should first look to see if it is already
> installed and if so is of a suitable version. In case not, it is desirable
> to include the library sources in the package and compile them as part of
> package installation. If the sources are too large, it is acceptable to
> download them as part of installation, but do ensure that the download is
> of a fixed version rather than the latest. Only as a last resort and with
> the agreement of the CRAN team should a package download pre-compiled
> software."
> 
> Apologies if this is documented somewhere I've missed, but how does one get
> CRAN's agreement to download pre-compiled software? A project I work with
> has been seeking permission since October, but emails to both
> c...@r-project.org and cran-submissi...@r-project.org about this have not
> been acknowledged.
> 
> I recognize that this mailing list is not CRAN, but I was hoping someone
> here might know the right way to reach the CRAN team to provide a judgment
> on such a request.
> 
> Thank you,
> Neal
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Clarifying CRAN's external libraries policy

2024-01-29 Thread Simon Urbanek
Nic,

as far as I can see, that thread clearly concluded that this is not a special 
case that would require external binary downloads.

Cheers,
Simon


> On Jan 30, 2024, at 11:11 AM, Nic Crane  wrote:
> 
> Hi Simon,
> 
> The email that Neal is referring to was sent by me (this email
> address) to c...@r-project.org on Mon, 23 Oct 2023.
> 
> Thanks,
> 
> Nic
> 
> 
> On Mon, 29 Jan 2024 at 18:51, Simon Urbanek  
> wrote:
>> 
>> Neal,
>> 
>> generally, binaries are not allowed since CRAN cannot check the provenance 
>> so it's not worth the risk, and it's close to impossible to maintain them 
>> over time across different systems, toolchains and architectures as they 
>> evolve. Historically, some packages allowed to provide binaries (e.g., back 
>> when the Windows toolchain was not as complete and there was only Win32 
>> target it was more common to supply a Windows binary) and CRAN was more 
>> lenient, but it should be avoided nowadays as it was simply too fragile.
>> 
>> As Andrew pointed out in special circumstances you can use external 
>> hash-checked *source* tar balls, but generally you should provide sources in 
>> the package.
>> 
>> I do not see any e-mail from you to c...@r-project.org about this, so please 
>> make sure you are using the correct e-mail if you intend to plead your case.
>> 
>> Cheers,
>> Simon
>> 
>> 
>> 
>>> On Jan 30, 2024, at 3:11 AM, Neal Richardson  
>>> wrote:
>>> 
>>> Hi,
>>> CRAN's policy on using external C/C++/Fortran/other libraries says:
>>> 
>>> "Where a package wishes to make use of a library not written solely for the
>>> package, the package installation should first look to see if it is already
>>> installed and if so is of a suitable version. In case not, it is desirable
>>> to include the library sources in the package and compile them as part of
>>> package installation. If the sources are too large, it is acceptable to
>>> download them as part of installation, but do ensure that the download is
>>> of a fixed version rather than the latest. Only as a last resort and with
>>> the agreement of the CRAN team should a package download pre-compiled
>>> software."
>>> 
>>> Apologies if this is documented somewhere I've missed, but how does one get
>>> CRAN's agreement to download pre-compiled software? A project I work with
>>> has been seeking permission since October, but emails to both
>>> c...@r-project.org and cran-submissi...@r-project.org about this have not
>>> been acknowledged.
>>> 
>>> I recognize that this mailing list is not CRAN, but I was hoping someone
>>> here might know the right way to reach the CRAN team to provide a judgment
>>> on such a request.
>>> 
>>> Thank you,
>>> Neal
>>> 
>>>  [[alternative HTML version deleted]]
>>> 
>>> __
>>> R-package-devel@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>> 
>> 
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Warnings from upstream C library in CRAN submission

2024-02-03 Thread Simon Urbanek
Satyaprakash,

those are clear bugs in the SUNDIALS library - they assume that "unsigned long" 
type is 64-bit wide (that assumption is also mentioned in the comments), but 
there is no such guarantee and on Windows it is only 32-bit wide, so the code 
has to be changed to replace "unsigned long" with the proper unsigned 64-bit 
type which is uint64_t. The code is simply wrong and won't work unless those 
issues are solved, so those are not just warnings but actual errors. It would 
also be prudent to check the rest of the code in the library for similar 
incorrect use of the "long" type where 64-bit use was intended.

Cheers,
Simon


> On Feb 4, 2024, at 4:38 AM, Satyaprakash Nayak  wrote:
> 
> Hi
> 
> I had a package 'sundialr' which was archived from CRAN. It is an interface
> to some of the solvers in SUNDIALS ODE Solving library. I have fixed the
> issue which was related to emails being forwarded from the maintainer's
> email address.
> 
> The repository code can be found at - https://github.com/sn248/sundialr
> 
> I have updated the upstream library and now I am getting the following
> warnings from CRAN which are all related to the upstream library. The
> package compiles without any other issues and can be used.
> 
> Flavor: r-devel-windows-x86_64
> Check: whether package can be installed, Result: WARNING
>  Found the following significant warnings:
>./sundials/sundials/sundials_hashmap.h:26:48: warning: conversion from
> 'long long unsigned int' to 'long unsigned int' changes value from
> '14695981039346656037' to '2216829733' [-Woverflow]
>./sundials/sundials/sundials_hashmap.h:27:48: warning: conversion from
> 'long long unsigned int' to 'long unsigned int' changes value from
> '1099511628211' to '435' [-Woverflow]
>sundials/sundials/sundials_hashmap.h:26:48: warning: conversion from
> 'long long unsigned int' to 'long unsigned int' changes value from
> '14695981039346656037' to '2216829733' [-Woverflow]
>sundials/sundials/sundials_hashmap.h:27:48: warning: conversion from
> 'long long unsigned int' to 'long unsigned int' changes value from
> '1099511628211' to '435' [-Woverflow]
>sundials/sundials/sundials_profiler.c:71:24: warning: function
> declaration isn't a prototype [-Wstrict-prototypes]
>  See 'd:/RCompile/CRANincoming/R-devel/sundialr.Rcheck/00install.out' for
> details.
>  Used C++ compiler: 'g++.exe (GCC) 12.3.0'
> 
> Flavor: r-devel-linux-x86_64-debian-gcc
> Check: whether package can be installed, Result: WARNING
>  Found the following significant warnings:
>sundials/sundials/sundials_profiler.c:71:41: warning: a function
> declaration without a prototype is deprecated in all versions of C
> [-Wstrict-prototypes]
>  See '/srv/hornik/tmp/CRAN/sundialr.Rcheck/00install.out' for details.
>  Used C++ compiler: 'Debian clang version 17.0.6 (5)'
> 
> I am hesitant to change anything in the SUNDIALS library C code because I
> don't understand the consequences of changing anything there.
> 
> Any help will be kindly appreciated.
> 
> Thank you.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN uses an old version of clang

2024-02-11 Thread Simon Urbanek
Just to include the necessary details: macOS CRAN build uses Apple clang-14, so 
you cannot assume anything higher. Also the target is macOS 11 SDK.

That said, LLVM does not support the special math functions at all according to 
the status report (see Mathematical Special Functions for C++17 at 
https://libcxx.llvm.org/Status/Cxx17.html) so Boost is your best bet.

BTW: this is not a Mac thing - you can replicate it on any other system, e.g. on 
Linux:

$ clang++-17 -std=c++17 -stdlib=libc++  bes.cc
bes.cc:11:49: error: no member named 'cyl_bessel_k' in namespace 'std'
   11 | std::cout << "K_.5(" << x << ") = " << std::cyl_bessel_k(.5, x) << 
'\n'
  |~^
bes.cc:13:35: error: no member named 'cyl_bessel_i' in namespace 'std'
   13 |   << (pi / 2) * (std::cyl_bessel_i(-.5, x)
  |  ~^
bes.cc:14:25: error: no member named 'cyl_bessel_i' in namespace 'std'
   14 |  - std::cyl_bessel_i(.5, x)) / std::sin(.5 * pi) << 
'\n';
  |~^
3 errors generated.

Cheers,
Simon


> On 10/02/2024, at 8:04 AM, Marcin Jurek  wrote:
> 
> All this makes sense, thanks for your tips, everyone!
> 
> Marcin
> 
> On Fri, Feb 9, 2024 at 9:44 AM Dirk Eddelbuettel  wrote:
> 
>> 
>> On 9 February 2024 at 08:59, Marcin Jurek wrote:
>> | I recently submitted an update to my package. It previous version relied
>> on
>> | Boost for Bessel and gamma functions but a colleague pointed out to me
>> that
>> | they are included in the standard library beginning with the C++17
>> | standard.
>> 
>> There is an often overlooked bit of 'fine print': _compiler support_ for a
>> C++ standard is not the same as the _compiler shipping a complete library_
>> for that same standard. This can be frustrating. See the release notes for
>> gcc/g++ and clang/clang++, IIRC they usually have a separate entry for C++
>> library support.
>> 
>> In this case, one can probably rely on LinkingTo: BH, which has been helping
>> with
>> Boost headers for over a decade.
>> 
>> Writing R Extensions is also generally careful in reminding us that such
>> language standard support is always dependent on the compiler at hand. So
>> package authors ought to check, just like R does via its extensive
>> configure
>> script when it builds.
>> 
>> Dirk
>> 
>> --
>> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available: ‘arrow’

2024-02-25 Thread Simon Urbanek
To quote Rob: "Version numbers are cheap"

The way the policy is worded, it is clear that you cannot complain if you didn't 
increase it, as you are taking a risk. Also, the incoming FTP won't let you 
upload the same version twice, so it wasn't really a problem until more recently, 
when there are multiple different ways to submit. Either way, changing the 
policy to MUST is probably the best way to avoid race conditions and certainly 
the only good practice.

Cheers,
Simon


> On 25/02/2024, at 5:44 PM, Rolf Turner  wrote:
> 
> 
> On Fri, 23 Feb 2024 10:19:41 -0600
> Dirk Eddelbuettel  wrote:
> 
>> 
>> On 23 February 2024 at 15:53, Leo Mada wrote:
>> | Dear Dirk & R-Members,
>> | 
>> | It seems that the version number is not incremented:
>> | # Archived
>> | arrow_14.0.2.1.tar.gz   2024-02-08 11:57  3.9M
>> | # Pending
>> | arrow_14.0.2.1.tar.gz   2024-02-08 18:24  3.9M
>> | 
>> | Maybe this is the reason why it got stuck in "pending".
>> 
>> No it is not.
>> 
>> The hint to increase version numbers on re-submission is a weaker
>> 'should' or 'might', not a strong 'must'.
>> 
>> I have uploaded a few packages to CRAN over the last two decades, and
>> like others have made mistakes requiring iterations. I have not once
>> increased a version number.
> 
> That's as may be but IMHO (and AFAICS) it never hurts to increment the
> version number, even if you've only corrected a trivial glitch.
> 
> cheers,
> 
> Rolf
> 
> -- 
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Stats. Dep't. (secretaries) phone:
> +64-9-373-7599 ext. 89622
> Home phone: +64-9-480-4619
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Problem with loading package "devtools" from CRAN.

2024-04-28 Thread Simon Urbanek
Rolf,

what do you mean by "broken"? Since you failed to include any proof or details, 
it's unlikely that anyone can help you, but chances are pretty high that it was 
a problem on your end. I just checked with R 4.4.0 on Ubuntu 22.04 and devtools 
installs and loads just fine, so it is certainly not broken on CRAN.

Make sure you don't have packages built for an old R version in your local libraries - 
that is the most common mistake - always remove them when upgrading R and 
re-install them if you still need them. You can check the locations of your libraries 
with .libPaths() in R. Sometimes, update.packages(checkBuilt=TRUE) can do the 
trick as well, but I prefer clean re-installs for safety as it also helps you 
clean up old cruft that is no longer needed.

Cheers,
Simon



> On Apr 29, 2024, at 1:19 PM, Rolf Turner  wrote:
> 
> 
> Executive summary:
> 
>> The devtools package on CRAN appears to be broken.
>> Installing devtools from github (using remotes::install_github())
>> seems to give satisfactory results.
> 
> This morning my software up-dater (Ubuntu 22.04.4) prompted me to
> install updated versions of some software, including r-base.  I thereby
> obtained what I believe is the latest version of R (4.4.0 (2024-04-24)).
> 
> Then I could not load the "devtools" package, which is fairly crucial to
> my work.
> 
> A bit of web-searching got me to a post on github by Henrik Bengtsson,
> which referred to the devtools problem.  I therefore decided to try
> installing devtools from github:
> 
>remotes::install_github("r-lib/devtools",lib="/home/rolf/Rlib")
> 
> Some 50-odd packages seemed to require up-dating.  I went for it, and
> after a fairly long wait, while messages about the updating flowed by,
> devtools seemed to get installed.  Now "library(devtools)" runs without
> error, so I am happy with my own situation.  However there seems to be
> a problem with the devtools package on CRAN, which ought to be fixed.
> 
> cheers,
> 
> Rolf Turner
> 
> -- 
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Stats. Dep't. (secretaries) phone:
> +64-9-373-7599 ext. 89622
> Home phone: +64-9-480-4619
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] re-exporting plot method?

2024-04-30 Thread Simon Urbanek
Kevin,

welcome to the S4 world! ;) There is really no good solution, since S4 only 
works at all if you attach the package, because it relies on replacing the base S3 
generic with its own - so the question remains what your options are for doing it.

The most obvious is to simply add Depends: Rgraphviz which makes sure that any 
generics required are attached so your package doesn't need to worry. This is 
the most reliable in a way as you are not limiting the functionality to methods 
you know about. The side-effect, though (besides exposing functions the user 
may not care about), is that such a package cannot be on CRAN, since Rgraphviz is 
not on CRAN (that said, you mentioned you are already importing it, so you seem 
not to worry about that).

The next option is to simply ignore Rgraphviz and instead add 
setGeneric("plot") to your package and add methods to Depends and 
importFrom(methods, setGeneric) + exportMethods(plot) to the namespace. This 
allows you to forget about any dependencies - you are just creating the S4 
generic from base::plot to make the dispatch work. This is the most 
light-weight solution as you only cherry-pick methods you need and there are no 
dependencies other than "methods". However, it is limited to just the functions 
you care about.

Finally, you could re-export the S4 plot generic from Rgraphviz, but I'd say 
that is the least sensible option, since it doesn't have any benefit over doing 
it yourself and only adds a hard dependency for no good reason. Also copying 
functions from another package opens up a can of worms with versions etc. - 
even if the risk is likely minimal.

Just for completeness - a really sneaky way would be to export an S3 plot 
method from your package - it would only be called in the case where the plot 
generic has not been replaced yet, so you could "fix" things on the fly by 
calling the generic from Rgraphviz, but that sounds a little hacky even for my 
taste ;).

Cheers,
Simon



> On 1/05/2024, at 6:03 AM, Kevin R. Coombes  wrote:
> 
> Hi,
> 
> I am working on a new package that primarily makes use of "igraph" 
> representations of certain mathematical graphs, in order to apply lots of the 
> comp sci algorithms already implemented in that package. For display 
> purposes, my "igraph" objects already include information that defines node 
> shapes and colors and edge styles and colors. But, I believe that the "graph" 
> - "Rgraphviz" - "graphNEL" set of tools will produce better plots of the 
> graphs.
> 
> So, I wrote my own "as.graphNEL" function that converts the "igraph" objects 
> I want to use into graphNEL (or maybe into "Ragraph") format in order to be 
> able to use their graph layout and rendering routines. This function is smart 
> enough to translate the node and edge attributes from igraph into something 
> that works correctly when plotted using the tools in Rgraphviz. (My 
> DESCRIPTION and NAMESPACE files already import the set of functions from 
> Rgraphviz necessary to make this happen.)
> 
> In principle, I'd like the eventual user to simply do something like
> 
> library("KevinsNewPackage")
> IG <- makeIgraphFromFile(sourcefile)
> GN <- as.graphNEL(IG)
> plot(GN)
> 
> The first three lines work fine, but the "plot" function only works if the 
> user also explicitly includes the line
> 
> library("Rgraphviz")
> 
> I suspect that there is a way with imports and exports in the NAMESPACE to 
> make this work without having to remind the user to load the other package. 
> But (in part because the plot function in Rgraphviz is actually an S4 method, 
> which I don't need to alter in any way), I'm not sure exactly what needs to 
> be imported or exported.
> 
> Helpful suggestions would be greatly appreciated.
> 
> Best,
>   Kevin
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Error handling in C code

2024-05-05 Thread Simon Urbanek
Jarrod,

could you point us to the code? There is not much to go by based on your email. 
One thing just in general: it's always safer to not re-map function names, 
especially since "error" can be defined in many random other headers, so it's 
better to use Rf_error() instead to avoid confusion with 3rd-party headers 
that may (re-)define the "error" macro (depending on the order you include them 
in).

Cheers,
Simon


> On 4/05/2024, at 3:17 AM, Jarrod Hadfield  wrote:
> 
> Hi,
> 
> I have an R library with C code in it. It has failed the CRAN checks for 
> Debian.  The problem is with the error function being undefined. Section 6.2 
> of the Writing R extensions (see below) suggests error handling can be 
> handled by error and the appropriate header file is included in R.h, but this 
> seems not to be the case?
> 
> Any help would be appreciated!
> 
> Thanks,
> 
> Jarrod
> 
> 6.2 Error signaling
> 
> The basic error signaling routines are the equivalents of stop and warning in 
> R code, and use the same interface.
> 
> void error(const char * format, ...);
> void warning(const char * format, ...);
> void errorcall(SEXP call, const char * format, ...);
> void warningcall(SEXP call, const char * format, ...);
> void warningcall_immediate(SEXP call, const char * format, ...);
> 
> These have the same call sequences as calls to printf, but in the simplest 
> case can be called with a single character string argument giving the error 
> message. (Don't do this if the string contains '%' or might otherwise be 
> interpreted as a format.)
> 
> These are defined in header R_ext/Error.h included by R.h.
> The University of Edinburgh is a charitable body, registered in Scotland, 
> with registration number SC005336. Is e buidheann carthannais a th' ann an 
> Oilthigh Dhùn Èideann, clàraichte an Alba, àireamh clàraidh SC005336.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel

