Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages.

2017-02-10 Thread J C Nash

We'd be more than happy to have you contribute directly. The goal is not just an
information session, but to build some momentum towards making the package
collection(s) easier to use effectively. Note to selves: "effectively" is
important -- we could make things easy by recommending only a few packages.

Best, JN


On 2017-02-10 09:29 AM, Michael Dewey wrote:

Dear all

That seems an interesting session. I am the maintainer of one of the CRAN Task
Views (MetaAnalysis) and will attend unless I am successful in the draw for
Wimbledon tickets.

Just in case I strike lucky, one question I would have raised from the floor if
I were there is "Does anyone read the Task Views?". Since I started mine I have
received only a couple of suggestions for additions, including a very abrupt
one about a package which had been included for months but whose author clearly
did not read the Task View before writing. So I would ask whether we need to
focus much energy on the Task Views.

So, maybe see you there, maybe not.

On 16/01/2017 14:57, ProfJCNash wrote:

Navigating the Jungle of R Packages

The R ecosystem has many packages in various collections,
especially CRAN, Bioconductor, and GitHub. While this
richness of choice speaks to the popularity and
importance of R, the large number of contributed packages
makes it difficult for users to find appropriate tools for
their work.

A session on this subject has been approved for useR! in
Brussels. The tentative structure is three short
introductory presentations, followed by discussion or
planning work to improve the tools available to help
users find the best R package and function for their needs.

The currently proposed topics are

- wrapper packages that allow diverse tools that perform
  similar functions to be accessed by unified calls

- collaborative mechanisms to create and update Task Views

- search and sort tools to find packages.

At the time of writing we have tentative presenters for
the topics, but welcome others. We hope these presentations
at useR! 2017 will be part of a larger discussion that will
contribute to an increased team effort after the conference
to improve the support for R users in these areas.


John Nash, Julia Silge, Spencer Graves

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel







Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages.

2017-02-11 Thread J C Nash
Thanks for this. Besides stirring the pot with this suggestion, my own approach
has been to try to do this with optimr/optimrx for "optimization" (actually
function minimization, possibly with bounds). Hans Werner Borchers has been
charging ahead with a global optimization wrapper, gloptim, but it is early days.


Jonathan: since you are planning to attend, perhaps we can collaborate to
include jmv as another example. One issue is that there are almost certainly
many anova tools, but unlike optimizers, where one wants to compare performance,
I believe anova will be about comparing possibly incompatible features. If
interested, get in touch offline.


Best, JN

On 2017-02-10 07:48 PM, Jonathon Love wrote:

hi,

first up let me apologise for breaking the thread. i subscribed to this list
after the initial email went out.

i'm not completely sure if the original post was meant to prompt a discussion
here, but now there's a discussion, i'm jumping in!

i'm a psychologist, and one of the challenges in my field is the number of
packages required to do what is "standard practice", and getting them all to
work together.

to do an ANOVA (the bread and butter of psych research) with all its
assumption checks, contrasts, corrections, etc. requires in the order of
seven packages.

our solution to this is to create an "uber" package, which makes use of all
these things behind a single function call (with many arguments), which is
what our jmv package is:

https://www.jamovi.org/jmv/

we represent an extreme (we even handle plots), but there are other examples
of more intermediate solutions: afex, psych, etc.
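as a toy sketch of the idea: one function bundling the fit and an assumption check behind a single call. run_anova() and its arguments are invented for illustration; jmv's real interface differs.

```r
# Toy sketch of the "uber package" idea: a single call that bundles
# the model fit and an assumption check that would otherwise require
# calls into several packages. run_anova() is an invented name; the
# real jmv interface is different.
run_anova <- function(formula, data, checks = TRUE) {
  fit <- aov(formula, data = data)              # base-R ANOVA fit
  out <- list(fit = fit, table = summary(fit))  # the main results
  if (checks) {
    # one of the many assumption checks an uber package would bundle:
    # Shapiro-Wilk normality test on the residuals
    out$normality <- shapiro.test(residuals(fit))
  }
  out
}

res <- run_anova(weight ~ group, data = PlantGrowth)  # built-in dataset
```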

i appreciate this is somewhat at odds with (what i perceive to be) the R ethos,
which is giving people very fine control over the intermediate parts of one's
analysis, but it is another approach to making it easier for people to find
appropriate tools for their field.

for me, the key is being "goal-centred" ("what is a person in my field trying
to achieve?") rather than "analysis-centred" ("this package provides analysis
X") ... but i appreciate this is likely an unpopular position.

i'll definitely be attending this session at useR!, and am happy to espouse
more unpopular views

cheers

jonathon








Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages.

2017-02-11 Thread J C Nash
Certainly Google can be useful, but it can also be infuriatingly time-wasting
when one needs to sort out related tools that do slightly different things.
Then good, up-to-date task views are important, and wrappers such as I and some
others are trying to develop can be a way to ease the chore of applying the
tools, or of changing between related ones where there isn't enough information
on which is best.


Perhaps Jim, Spencer, and I (others welcome!) can come up with some small
examples showing where Google / sos / other search tools and the task views
(Julia?) can provide guidance. After all, the purpose of the useR! session is
to develop improved ways to access R's packages.
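One such small example, using only base/utils (the sos package's findFn() would widen the search to all of CRAN, if installed):

```r
# Search the help pages of installed packages for a topic; this is
# the kind of query the session aims to make easier to answer well.
hits <- help.search("optimization")
matches <- hits$matches   # a data frame of matching help topics
# (possibly empty on a minimal installation)
```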


Cheers, John Nash

On 2017-02-10 05:26 PM, Jim Lemon wrote:

This discussion started me thinking about searching for a function or
package, as many questions on the R help list indicate that the poster
couldn't find (or hasn't searched for) what they want. I don't think I
have ever used task views. If I haven't got a clue where to look for
something, I use Google. I can't recall an occasion when I didn't get
an answer, even if it was that what I wanted didn't exist. Perhaps we
should ask why Google is so good at answering uninformed questions, in
particular about R. I'm not the only person on the help list who
advises the clueless to try Google.

Jim


On Sat, Feb 11, 2017 at 3:51 AM, Ben Bolker  wrote:

  I definitely read the task views and advise others to do so.  I
don't know how representative my little corner of the world is,
though.

  I have an embryonic task view on mixed models at
https://github.com/bbolker/mixedmodels-misc/blob/master/MixedModels.ctv
but the perfect is the enemy of the good ...






Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages

2017-02-16 Thread J C Nash

As a way to organize discussion for the session AND for the hopefully 
derivative development, I have set up a Github project

https://github.com/nashjc/Rnavpkg

In particular, I've started a small wiki there, and have built a page, "Ideas
and people", where I have tried to summarize succinctly the comments so far.
I'm sure I've missed some contributions and got some ideas bent, and I ask
your indulgence and assistance. I'll be happy to give access to edit/improve
things (and freely admit the GitHub process is a bit new to me, being used to
svn on R-forge).


Julia has suggested rmdshower for preparing slide presentations (which clearly
can be developed collaboratively on Rnavpkg). I also note that one can use
RStudio's File/New/File/Presentation, though I haven't checked how these
render to formats other than HTML.


Cheers, John Nash



Re: [R-pkg-devel] Solaris SPARC, Fortran, and logical errors?

2017-03-15 Thread J C Nash

Possibly tangential, but has there been any effort to set up a SPARC testbed?
It seems we could use a network-available (virtual?) machine, since this
platform is often the unfortunate one. Unless, of course, there's a sunset date.

For information, I mentioned SPARC at our local Linux group, and apparently
there are a couple of folk who have them running, but I didn't find out the
state of the OS etc.

JN


On 2017-03-15 10:40 AM, Avraham Adler wrote:

Hello.

The Delaporte package works properly on all R-core platforms except
Solaris SPARC, where it compiles properly but fails a number of its
tests [1]. Not having access to a SPARC testbed, I'm limited in what
kind of diagnostics I can do. One thing I have noticed is that a lot
of the failures occur when I am passing non-default logicals (like
lower tail or log). For example, the first failure at that link is
when "log = TRUE" is supposed to be passed, but the SPARC answers are
the unlogged values. Of the 22 failed tests, 12 pass logicals.

I'll give an example of how it is coded below, and if anyone
recognizes where SPARC specifically goes wrong, I'd appreciate it. I
guess, if I absolutely had to, I could convert the logical to an
integer in C and pass the integer to Fortran, which should work even
for SPARC, but I'd prefer not to if I can help it.

Thank you,

Avi

[1] https://cran.r-project.org/web/checks/check_results_Delaporte.html

*Example Code*

R code:

ddelap <- function(x, alpha, beta, lambda, log = FALSE) {
  if (!is.double(x))      {storage.mode(x)      <- 'double'}
  if (!is.double(alpha))  {storage.mode(alpha)  <- 'double'}
  if (!is.double(beta))   {storage.mode(beta)   <- 'double'}
  if (!is.double(lambda)) {storage.mode(lambda) <- 'double'}
  if (any(x > floor(x))) {
    warning("Non-integers passed to ddelap. These will have 0 probability.")
  }
  .Call(ddelap_C, x, alpha, beta, lambda, log)
}

C code:

void ddelap_f(double *x, int nx, double *a, int na, double *b, int nb,
              double *l, int nl, int *lg, double *ret);

extern SEXP ddelap_C(SEXP x, SEXP alpha, SEXP beta, SEXP lambda, SEXP lg){
  const int nx = LENGTH(x);
  const int na = LENGTH(alpha);
  const int nb = LENGTH(beta);
  const int nl = LENGTH(lambda);
  SEXP ret;
  PROTECT(ret = allocVector(REALSXP, nx));
  ddelap_f(REAL(x), nx, REAL(alpha), na, REAL(beta), nb, REAL(lambda), nl,
           LOGICAL(lg), REAL(ret));
  UNPROTECT(1);
  return(ret);
}

Fortran: (not posting ddelap_f_s as that doesn't handle the logging)

subroutine ddelap_f(x, nx, a, na, b, nb, l, nl, lg, pmfv) &
                    bind(C, name = "ddelap_f")

    integer(kind = c_int), intent(in), value          :: nx, na, nb, nl      ! Sizes
    real(kind = c_double), intent(in), dimension(nx)  :: x                   ! Observations
    real(kind = c_double), intent(out), dimension(nx) :: pmfv                ! Result
    real(kind = c_double), intent(in)                 :: a(na), b(nb), l(nl) ! Parameters
    logical(kind = c_bool), intent(in)                :: lg                  ! Log flag
    integer                                           :: i                   ! Loop index

    !$omp parallel do default(shared) private(i)
    do i = 1, nx
        if (x(i) > floor(x(i))) then
            pmfv(i) = ZERO
        else
            pmfv(i) = ddelap_f_s(x(i), a(mod(i - 1, na) + 1), &
                                 b(mod(i - 1, nb) + 1), l(mod(i - 1, nl) + 1))
        end if
    end do
    !$omp end parallel do

    if (lg) then
        pmfv = log(pmfv)
    end if

end subroutine ddelap_f
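The workaround mentioned above (passing an integer rather than a logical across the C/Fortran boundary) would start on the R side with an explicit coercion; a minimal sketch, independent of the Delaporte code:

```r
# Sketch of the integer-flag workaround: coerce the R logical to an
# integer before .Call(), so C receives it via INTEGER() as int* and
# the Fortran side can declare integer(kind = c_int) instead of
# logical(kind = c_bool), sidestepping the logical-representation issue.
to_flag <- function(lg) as.integer(isTRUE(lg))

to_flag(TRUE)    # 1L
to_flag(FALSE)   # 0L
```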






Re: [R-pkg-devel] Solaris SPARC, Fortran, and logical errors?

2017-03-16 Thread J C Nash

FWIW, it appears that QEMU has an admittedly slow implementation that supports
some architectures beyond x86/amd64, and that there is recent activity. See

http://wiki.qemu-project.org/Documentation/Platforms/SPARC

An alternative might be to persuade Oracle to provide a SPARC builder, since
they advertise Oracle R Technologies at

http://www.oracle.com/technetwork/database/database-technologies/r/r-technologies/r-offerings-1566363.html

but the dates on that page are from 2014. Perhaps someone has contacts at
Oracle and could at least raise the possibility.

JN



On 2017-03-16 08:20 AM, Ben Bolker wrote:

I completely agree that testing on SPARC Solaris is valuable, however
much of a nuisance it is.  But I also agree that it would be great if
we could find a way to provide a publicly accessible SPARC Solaris
testing framework.

On Thu, Mar 16, 2017 at 6:49 AM, Uwe Ligges
 wrote:



On 15.03.2017 18:30, Ben Bolker wrote:






  The virtual machine platforms I know of (admittedly not a complete
list!) only support Solaris on x86, e.g.



Yes, you cannot emulate a SPARC in an efficient way on an amd64 platform.

I take the opportunity to repeat why testing on *SPARC Solaris* gives many
benefits:

- this way we cover big- and little-endian platforms (i.e. for future
stability, so that things work on what still appear to be esoteric platforms
such as ARM-based architectures)
- we cover one of the commercial Unixes, i.e. we see
  + how stuff works on the typically rather old toolchains
  + what happens on non-gnu/gcc setups and how many GNUisms are used

Best,
Uwe Ligges





https://community.oracle.com/thread/2569292







[R-pkg-devel] Advice on dealing with absent dependencies

2017-09-18 Thread J C Nash

Although I have been working with R for quite a while, I still find some issues
of dependencies troubling. In particular, my optimr package is set up to allow
users to access many function minimization tools via a single wrapper. In the
past I've had notices from CRAN to fix things when called packages fail. To
avoid such messages, the CRAN version has a limited set of solvers, and I've
put the full version (optimrx) on R-forge.

I'm now thinking I should try to adjust optimr so that it works with whichever
of the solvers are available (a user may not want to install them all, for
example), and gives no more than a warning when one or more is unavailable.

Can someone point me to best practice for doing this sort of conditional
operation? While I've some ideas, I'd prefer to do things right and not give
the CRAN admins any extra work. Assuming I succeed, I'll happily prepare a
write-up if one is not available.
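For reference, the usual pattern is to list optional solvers under Suggests and guard each use with requireNamespace(); a sketch with a hypothetical solver package name:

```r
# Guarded use of an optional solver. "someSolver" is a placeholder
# for any package listed in Suggests rather than Imports; the guard
# degrades to a warning instead of an error when it is missing.
run_optional_solver <- function(...) {
  if (!requireNamespace("someSolver", quietly = TRUE)) {
    warning("Package 'someSolver' is not installed; skipping this solver.")
    return(NULL)
  }
  # would dispatch to the solver here, e.g. someSolver::solve_it(...)
}
```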

Cheers,

John Nash



Re: [R-pkg-devel] tibbles are not data frames

2017-09-26 Thread J C Nash
Having been around a while and been part of several programming language and
other standards efforts (see ISO 6373:1984 and IEEE 754-1985), I prefer some
democracy at the level of getting a standard, though perhaps at the design
level I can agree with Hadley. However, we're now at the stage of needing to
clean up R and actually get rid of some serious annoyances, among which I
would include my own contributions that appear in optim(), namely the
Nelder-Mead, BFGS and CG options, for which there are replacements.

In the tibble/data-frame issue, it would appear there could be a resolution
with some decision making at the R-core level, and whether that is democratic
or ad hoc, it needs to happen.

JN


On 2017-09-26 05:08 PM, Hadley Wickham wrote:

> I'm not sure that democracy works for programming language design.
>
> Hadley



Re: [R-pkg-devel] tibbles are not data frames

2017-09-26 Thread J C Nash
Duncan's observation is correct. The background work to the standards
I worked on was a big effort, and the content was a lot smaller than R,
though possibly similar in scope to dealing with the current question.
The "voting" was also very late in the process, after the proposals
were developed, discussed and written, so more a confirmation of a
decision than a vote to do some work.

On the other hand, I do think such effort has to be made from time to
time. On this particular matter I don't feel well-suited. However, the
collective body of material that is R is mostly a result of those of us
who are willing to put out the effort, particularly R-core members.

JN

On 2017-09-26 07:00 PM, Duncan Murdoch wrote:
> On 26/09/2017 4:52 PM, Jens Oehlschlägel wrote:
>>
>> On 26.09.2017 15:37, Hadley Wickham wrote:
>>> I decided to make [.tibble type-stable (i.e. always return a data
>>> frame) because this behaviour causes substantial problems in real data
>>> analysis code. I did it understanding that it would cause some package
>>> developers frustration, but I think it's better for a handful of
>>> package maintainers to be frustrated than hundreds of users creating
>>> dangerous code.
>>>
>>> Hadley
>>>
>>
>> If that is right -- and I tend to believe it is right -- this change had
>> better been done in R core and not on package level. I think the root of
>> this evil is design inconsistencies of the language together with the
>> lack of removing these inconsistencies. The longer we hesitated, the
>> more packages such a change could break. The lack of addressing issues
>> in R core drives people to try to solve issues on package level. But now
>> we have two conflicting standards, i.e. a fork-within-the-language: Am I
>> a member of the tidyverse or not? Am I writing a package for the
>> tidyverse or for standard-R or for both. With a fork-of-the-language we
>> would at least have a majority vote for one of the two and only the
>> fitter would survive. But with a fork-within-the-language 'R' gets more
>> and more complex, and working with it more and more difficult. There is
>> not only the tidyverse, also the Rcppverse and I don't know how many
>> other verses. If there is no extinction of inconsistencies in R, and not
>> sufficient evolution in R, but lots of evolution in Julia, evolution will
>> extinguish R together with all its foobarverses in favor of Julia (or
>> Python). Maybe that's a good thing.
>>
>> I think tibble should respect drop=TRUE and respect the work of all
>> package authors who wrote defensive code and explicitly passed drop=
>> instead of relying on the (wrong) default. Again: better would be a
>> long-term clean-up roadmap of R itself and one simple standard called
>> 'data.frame'. Instead of forking or betting on any particular
>> foobarverse: why not have direct democratic votes about certain critical
>> features of such a long-term roadmap in such a big community?
> 
> 
> I think R Core would not be interested in a vote, because you'd be voting to 
> give them work to do, and that's really rude.
> 
> What would have a better chance of success would be for someone to write a 
> short article describing the proposal in
> detail, and listing all changes to CRAN and Bioconductor packages that would 
> be necessary to implement it.  That's a lot
> of work!  Do you have time to do it?
> 
> Duncan Murdoch
> 

Re: [R-pkg-devel] Problems with too much testing

2021-04-16 Thread J C Nash
Another approach is to change the responsibility.

My feeling is that tests in the TESTING package should be modifiable by the
maintainer of the TESTED package, with both packages suspended if the two
maintainers cannot agree. We need to be able to move forward when legacy
behaviour is outdated or just plain wrong. Or, in the case that affects me,
when improvements in iterative schemes change iterates slightly. My guess is
that Duncan's example is a case in point.

I doubt this will ever occur, as it doesn't seem to be the R way. However, I
do know that improvements in methods are not going to CRAN in some cases.
JN



Re: [R-pkg-devel] Problems with too much testing

2021-04-16 Thread J C Nash
I'm generally in accord with Duncan on this. There are inevitably situations
where general rules don't apply. Our challenge is to find practical ways to
keep the overall workload of all participants in the process to a minimum.
JN

On 2021-04-16 10:18 a.m., Duncan Murdoch wrote:
> On 16/04/2021 9:49 a.m., J C Nash wrote:
> In the cases I've been involved with, the authors of the testing package
> have accepted suggested changes when I've made them: I think that's also
> part of "the R way". However, this takes time for both of us: I need to
> understand what they are intending to test before I can suggest a change to
> it, and they need to understand my change before they can decide if it is
> acceptable, or whether further changes would also be necessary.
>
> Github helps a lot with this: if the testing package is there, I can quickly
> reproduce the issue, produce a fix, and send it to the author, who can tweak
> it if I've set things up properly.
>
> For the kinds of changes you're making, I suspect relaxing a tolerance would
> often be enough, though if you switch algorithms and record that in your
> results, the testing package may need to replace reference values. I think
> I'd be uncomfortable doing that.
>
> Duncan Murdoch
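Relaxing a tolerance of the kind discussed here is often a one-line change in the testing package; the numbers below are invented:

```r
# A reference value recorded before an algorithm improvement, and the
# slightly different iterate the improved method now produces.
ref <- 1.2345678
new <- 1.2345601
# At the default tolerance (about 1.5e-8) all.equal() reports a
# mismatch; a looser, explicitly chosen tolerance accepts the
# improved result without replacing the reference value.
isTRUE(all.equal(ref, new))                    # FALSE
isTRUE(all.equal(ref, new, tolerance = 1e-5))  # TRUE
```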



Re: [R-pkg-devel] Question about preventing CRAN package archival

2021-06-02 Thread J C Nash
I just downloaded the source of the matrixcalc package to see what it
contained. The functions I looked at seem fairly straightforward, and the OP
could likely develop equivalent features in his own code, possibly avoiding a
function call. Avoiding the function call means NAMESPACE etc. are not
involved, so there are fewer places for getting into trouble, assuming the
inline code works properly.
JN


On 2021-06-02 12:37 p.m., Duncan Murdoch wrote:
> On 02/06/2021 12:13 p.m., Ben Staton wrote:
>> Hello,
>>
>> I received an email notice from CRAN indicating that my R package
>> ('postpack') will be archived soon if I do not take any action, and I want
>> to avoid that outcome. The issue is not caused by my package, but instead
>> by a package that my package depends on:
>>
>> "... package 'matrixcalc' is now scheduled for archival on 2021-06-09,
>> and archiving this will necessitate also archiving its strong reverse
>> dependencies."
>>
>> Evidently, 'matrixcalc' has been returning errors on new R builds,
>> prompting CRAN to list it as a package to be archived. My package
>> 'postpack' has 'matrixcalc' listed in the Imports field, which I assume is
>> why I received this email.
>>
>> I want to keep 'postpack' active and don't want it to be archived. I still
>> need package 'matrixcalc', but not for most functions. Could I simply move
>> 'matrixcalc' to the Suggests list and submit the new version to CRAN to
>> remove the "Strong Reverse Dependency" issue, and so prevent CRAN from
>> archiving my package?
> 
> That's part of one solution, but not the best solution.
>
> If you move it to Suggests, you should make sure that your package checks
> for it before every use, and falls back to some other calculation if it is
> not present. Be aware that once it is archived, almost none of your users
> will have it available, so this is kind of like dropping the functions that
> it supports.
>
> Another solution, which would be great for the community, might be for you
> to offer to take over as maintainer of matrixcalc. Then you'd fix whatever
> problems it has, and you wouldn't need to worry about it. I haven't looked
> at the issues, so I don't know if this is feasible.
>
> A third choice would be for you to copy the functions you need from
> matrixcalc into your own package so you can drop the dependency. This is
> generally legal under the licenses that CRAN accepts, but you should check
> anyway.
>
> A fourth choice would be for you to contact the matrixcalc maintainer and
> help them to fix the issues so that matrixcalc doesn't get archived. They
> may or may not be willing to work with you.
>
> I'd say the third choice is the best in the short term, and the second or
> fourth would be good long-term solutions.
>
> Duncan Murdoch
> 


Re: [R-pkg-devel] Question about preventing CRAN package archival

2021-06-02 Thread J C Nash
As noted by John Harrold and in my previous posting, these are not monster codes.
I'd check what I needed and simply work out enough R to make my package work.
Most of these matrix functions are pretty much old-fashioned math translated
into R. I can't see anyone engaging lawyers if the OP translates the variable
names to the ones he is using and more or less mimics the bits of code needed.
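Rough base-R equivalents of the three matrixcalc checks named later in this thread (is.square.matrix, is.symmetric.matrix, is.positive.definite) might look like the sketch below. This is illustrative only; the originals validate inputs and handle tolerances more carefully.

```r
# Rough base-R stand-ins for three matrixcalc checks; a sketch, not the
# originals, which do more input validation.
is_square <- function(x) is.matrix(x) && nrow(x) == ncol(x)
is_symmetric <- function(x, tol = 1e-8) {
  is_square(x) && max(abs(x - t(x))) < tol
}
is_positive_definite <- function(x, tol = 1e-8) {
  # symmetric with all eigenvalues above the tolerance
  is_symmetric(x, tol) &&
    all(eigen(x, symmetric = TRUE, only.values = TRUE)$values > tol)
}
```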

Cheers, JN


On 2021-06-02 3:15 p.m., John Harrold wrote:
> To add another option. In the past when this has happened to me I've found
> other packages that provide similar functionality.
> 
> I'm assuming that is.square.matrix just checks that the number of columns
> equals the number of rows? And the others can probably be implemented pretty easily.
> 
> On Wed, Jun 2, 2021 at 10:41 AM Ben Staton  wrote:
> 
>> My package uses the MIT license, so would that not meet the compatibility
>> requirements?
>>
>> I will attempt to reach out to the package author - thanks for your help!
>>
>> On Wed, Jun 2, 2021 at 10:31 AM Ben Bolker  wrote:
>>
>>> That all sounds exactly right.
>>>GPL >= 2 allows you to use the material without asking permission as
>>> long as your package is compatibly licensed (e.g. also GPL).
>>>Under normal circumstances it would be polite to ask permission, but
>>> if the reason for doing this is that the maintainer is unreachable in
>>> the first place ...
>>>
>>>   If you want to try a little harder, it seems quite possible that you
>>> can reach the matrixcalc maintainer at the (personal) e-mail address
>>> shown in this page:
>>>
>>>
>> https://www.facebook.com/photo/?fbid=10208324530363130&set=ecnf.1000413042
>>>
>>>(Possibly an identity confusion, but I rate that as unlikely based on
>>> other facebook snooping)
>>>
>>>I don't think a short, polite e-mail request would be out of bounds,
>>> they can always ignore it or tell you to go away.
>>>
>>>cheers
>>> Ben Bolker
>>>
>>> On 6/2/21 1:15 PM, Ben Staton wrote:
>>>> Hello,
>>>>
>>>> Thank you for your detailed list of solutions.
>>>>
>>>> I was initially tempted to go with option 1 (move matrixcalc to
>> suggests
>>>> and check for its existence before using functions that rely on it),
>> but
>>> as
>>>> mentioned, this is not a long term fix.
>>>>
>>>> I unfortunately can't take on the responsibilities of option 2
>> (becoming
>>>> the package maintainer) -- there is much that this package does that I
>> do
>>>> not understand, and do not wish to feign authority!
>>>>
>>>> I plan to take option 3 (copy the needed functions into my package).
>>> There
>>>> are only three functions I need from matrixcalc, and all three are
>> fairly
>>>> simple (is.square.matrix
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.square.matrix.R>,
>>>> is.symmetric.matrix
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.symmetric.matrix.R>, and
>>>> is.positive.definite
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.positive.definite.R>) and
>>> there
>>>> is only one function in postpack that needs them. I plan to define them
>>>> within the postpack function. matrixcalc is licensed under GPL >= 2 and
>>>> based on my scan of the license text, this is allowed. Is that correct?
>>>>
>>>> Regarding option 4 (contacting the matrixcalc maintainer), the original
>>>> email from CRAN mentioned that they have attempted to contact the
>> package
>>>> author with no response.
>>>>
>>>> Thank you!
>>>>
>>>> On Wed, Jun 2, 2021 at 9:52 AM J C Nash  wrote:
>>>>
>>>>> I just downloaded the source matrixcalc package to see what it
>>> contained.
>>>>> The functions
>>>>> I looked at seem fairly straightforward and the OP could likely
>> develop
>>>>> equivalent features
>>>>> in his own code, possibly avoiding a function call. Avoiding the
>>> function
>>>>> call means NAMESPACE etc. are not involved, so fewer places for
>> getting
>>>>> into
>>>>> trouble, assuming the inline code works properly.
>>>>>
>>>>> JN
>>>>>
>>>>>
>>>>> On 2021-06-02 12:37 p.m., Duncan Murdoch wrote:

[R-pkg-devel] Windows load error installing package

2021-06-10 Thread J C Nash
Hi,

I'm mentoring Arkajyoti Bhattacharjee for the Google Summer of Code project 
"Improvements to nls()".

Thanks to help from Duncan Murdoch, we have extracted the nls() functionality 
to a package nlspkg and are building
an nlsalt package. We can then run nlspkg::AFunction() and nlsalt::AFunction() 
in a single script to compare.
This works great in Linux, with the packages building and installing under the 
command line or in Rstudio.
But in Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a number 
of errors of the type

*** arch - i386
C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'

The reference is to a BLAS function, so I am fairly certain there is some 
failed pointer, possibly a
makevars.win entry, that we need. So far my searches and (possibly silly) 
attempts to provide links
have failed.

Can anyone provide suggestions?

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Windows load error installing package

2021-06-10 Thread J C Nash
Thanks Dirk:

It looks like R on Windows isn't reporting any BLAS or LAPACK. Here is the output 
from sessionInfo() on
my Win10 (VirtualBox VM) and Linux Mint 20.1 systems. However, I've not got any 
idea how to fix
this.

JN


>> sessionInfo()
> R version 4.1.0 (2021-05-18)
> Platform: x86_64-w64-mingw32/x64 (64-bit)
> Running under: Windows 10 x64 (build 19042)
> 
> Matrix products: default
> 
> locale:
> [1] LC_COLLATE=English_United States.1252
> [2] LC_CTYPE=English_United States.1252
> [3] LC_MONETARY=English_United States.1252
> [4] LC_NUMERIC=C
> [5] LC_TIME=English_United States.1252
> 
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base
> 
> loaded via a namespace (and not attached):
> [1] compiler_4.1.0
>>
> 
> 
>> sessionInfo()
> R version 4.1.0 (2021-05-18)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Linux Mint 20.1
> 
> Matrix products: default
> BLAS:   /usr/lib/x86_64-linux-gnu/openblas-pthread/libblas.so.3
> LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/liblapack.so.3
> 
> locale:
>  [1] LC_CTYPE=en_CA.UTF-8   LC_NUMERIC=C  
>  [3] LC_TIME=en_CA.UTF-8LC_COLLATE=en_CA.UTF-8
>  [5] LC_MONETARY=en_CA.UTF-8LC_MESSAGES=en_CA.UTF-8   
>  [7] LC_PAPER=en_CA.UTF-8   LC_NAME=C 
>  [9] LC_ADDRESS=C   LC_TELEPHONE=C
> [11] LC_MEASUREMENT=en_CA.UTF-8 LC_IDENTIFICATION=C   
> 
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base 
> 
> loaded via a namespace (and not attached):
> [1] compiler_4.1.0
>> 


On 2021-06-10 9:37 a.m., Dirk Eddelbuettel wrote:
> 
> On 10 June 2021 at 09:22, J C Nash wrote:
> | Thanks to help from Duncan Murdoch, we have extracted the nls() 
> functionality to a package nlspkg and are building
> | an nlsalt package. We can then run nlspkg::AFunction() and 
> nlsalt::AFunction() in a single script to compare.
> | This works great in Linux, with the packages building and installing under 
> the command line or in Rstudio.
> | But in Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a 
> number of errors of the type
> | 
> | *** arch - i386
> | 
> C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
> | loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'
> | 
> | The reference is to a BLAS function, so I am fairly certain there is some 
> failed pointer, possibly a
> | makevars.win entry, that we need. So far my searches and (possibly silly) 
> attempts to provide links
> | have failed.
> | 
> | Can anyone provide suggestions?
> 
> Guess: On Linux you use a complete (external) BLAS; on Windows you use the
> (subset) BLAS provided by R, which may not have the desired function, rendering
> your approach less portable.  See what sessionInfo() has to say on both.
> 
> Dirk
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Windows load error installing package SOLVED

2021-06-11 Thread J C Nash
After some flailing around, discovered a posting

https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code

which showed a makevars.win file containing

PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)

I had tried several similar makevars.win files, but with PKG_LIBS+= and
no spaces. These libraries are mentioned in Writing R Extensions, but given
the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be there
in the documentation. I've separately noted that Linux sessionInfo() shows
BLAS and LAPACK but Windows does not.

Cheers, JN

On 2021-06-10 9:37 a.m., Dirk Eddelbuettel wrote:
> 
> On 10 June 2021 at 09:22, J C Nash wrote:
> | Thanks to help from Duncan Murdoch, we have extracted the nls() 
> functionality to a package nlspkg and are building
> | an nlsalt package. We can then run nlspkg::AFunction() and 
> nlsalt::AFunction() in a single script to compare.
> | This works great in Linux, with the packages building and installing under 
> the command line or in Rstudio.
> | But in Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a 
> number of errors of the type
> | 
> | *** arch - i386
> | 
> C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
> | loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'
> | 
> | The reference is to a BLAS function, so I am fairly certain there is some 
> failed pointer, possibly a
> | makevars.win entry, that we need. So far my searches and (possibly silly) 
> attempts to provide links
> | have failed.
> | 
> | Can anyone provide suggestions?
> 
> Guess: On Linux you use a complete (external) BLAS; on Windows you use the
> (subset) BLAS provided by R, which may not have the desired function, rendering
> your approach less portable.  See what sessionInfo() has to say on both.
> 
> Dirk
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Windows load error installing package SOLVED - additional note

2021-06-12 Thread J C Nash
Two minor notes:

1) The Writing R Extensions manual, as far as I can determine, does not inform 
package
developers that Makevars.win needs to be in the src/ subdirectory. I followed 
the example
of some other packages to choose where to put it.

2) Also, while I managed to get my package to install with "makevars.win", I 
got a
WARNING on running a CHECK until I replaced it with "Makevars.win", i.e., 
Camel-case
name.

Do these observations merit edits in the manual?

JN


On 2021-06-11 11:16 a.m., J C Nash wrote:
> After some flailing around, discovered a posting
> 
> https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code
> 
> which showed a makevars.win file containing
> 
> PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)
> 
> I had tried several similar such makevars.win files, but trying PKG_LIBS+= and
> no spaces. There is mention of the libraries in Writing R Extensions, but 
> given
> the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be there
> in the documentation. I've separately noted that Linux sessionInfo() shows
> BLAS and LAPACK but Windows does not.
> 
> Cheers, JN
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Windows load error installing package SOLVED - additional note

2021-06-13 Thread J C Nash
Thanks Uwe.

My misunderstanding. I thought the reference was to one of the
"src/" directories in the base R tree. However, I found examples in
some packages for the location.

Windows is alien territory for me, unfortunately.

JN


On 2021-06-12 7:27 p.m., Uwe Ligges wrote:
> 
> 
> On 12.06.2021 16:39, J C Nash wrote:
>> Two minor notes:
>>
>> 1) The Writing R Extensions manual, as far as I can determine, does not 
>> inform package
>> developers that Makevars.win needs to be in the src/ subdirectory. I 
>> followed the example
>> of some other packages to choose where to put it.
> 
> I just searched for Makevars.win in Writing R Extensions and the first 
> occurrence is:
> "There are platform-specific file names on Windows: src/Makevars.win"
> 
> So it tells you both that the file should be in src/ and how to capitalize it.
> 
> Best,
> Uwe
> 
> 
>>
>> 2) Also, while I managed to get my package to install with "makevars.win", I 
>> got a
>> WARNING on running a CHECK until I replaced it with "Makevars.win", i.e., 
>> Camel-case
>> name.
>>
>> Do these observations merit edits in the manual?
>>
>> JN
>>
>>
>> On 2021-06-11 11:16 a.m., J C Nash wrote:
>>> After some flailing around, discovered a posting
>>>
>>> https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code
>>>
>>> which showed a makevars.win file containing
>>>
>>> PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)
>>>
>>> I had tried several similar such makevars.win files, but trying PKG_LIBS+= 
>>> and
>>> no spaces. There is mention of the libraries in Writing R Extensions, but 
>>> given
>>> the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be 
>>> there
>>> in the documentation. I've separately noted that Linux sessionInfo() shows
>>> BLAS and LAPACK but Windows does not.
>>>
>>> Cheers, JN
>>>
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R feature suggestion: Duplicated function arguments check

2021-11-08 Thread J C Nash

I think this is similar in nature (though not detail) to an issue raised
on StackOverflow where the OP used "x" in dot args and it clashed with the
"x" in a numDeriv call in my optimx package. I've got a very early fix (I
think), though moderators on StackOverflow were unpleasant enough to
delete my request for the OP to contact me so I could get more
information to make improvements. Sigh. Developers need conversations
with users to improve their code.

Re: argument duplication -- In my view, the first goal should be to inform
the user of the clash. Doing anything further without providing information
is likely a very bad idea, though discussion of possibilities of action after
notification is certainly worthwhile.

Best, JN


On 2021-11-08 11:53 a.m., Duncan Murdoch wrote:

On 08/11/2021 11:48 a.m., Avi Gross via R-package-devel wrote:

Vincent,

But is the second being ignored the right result?

In many programming situations, subsequent assignments replace earlier ones.
And consider the way R allows something like this:

func(a=2, b=3, a=4, c=a*b)

Is it clear how to initialize the default for c as it depends on one value
of "a" or the other?


That c=a*b only works with non-standard tidyverse evaluation.  It causes other problems, e.g. the inability to pass ... 
properly (see https://github.com/tidyverse/glue/issues/231 for an example).


Duncan Murdoch



Of course, you could just make multiple settings an error rather than
choosing an arbitrary fix.

R lists are more like a BAG data structure than a SET.

-Original Message-
From: R-package-devel  On Behalf Of
Vincent van Hees
Sent: Monday, November 8, 2021 11:25 AM
To: Duncan Murdoch 
Cc: r-package-devel@r-project.org
Subject: Re: [R-pkg-devel] R feature suggestion: Duplicated function
arguments check

Thanks Duncan, I have tried to make a minimalistic example:

myfun = function(...) {
  input = list(...)
  mysum = function(A = c(), B = c()) {
    return(A + B)
  }
  if ("A" %in% names(input) & "B" %in% names(input)) {
    print(mysum(A = input$A, B = input$B))
  }
}

# test:

myfun(A = 1, B = 2, B = 4)

[1] 3

# So, the second B is ignored.



On Mon, 8 Nov 2021 at 17:03, Duncan Murdoch 
wrote:


On 08/11/2021 10:29 a.m., Vincent van Hees wrote:

Not sure if this is the best place to post this message, as it is
more

of a

suggestion than a question.

When an R function accepts more than a handful of arguments there is
the risk that users accidentally provide arguments twice, e.g
myfun(A=1, B=2, C=4, D=5, A=7), and if those two values are not the
same it can have frustrating side-effects. To catch this I am
planning to add a check for duplicated arguments, as shown below, in
one of my own functions. I am

now

wondering whether this would be a useful feature for R itself to
operate

in

the background when running any R function that has more than a
certain number of input arguments.

Cheers, Vincent

myfun = function(...) {
  # check input arguments for duplicate assignments
  input = list(...)
  if (length(input) > 0) {
    argNames = names(input)
    dupArgNames = duplicated(argNames)
    if (any(dupArgNames)) {
      for (dupi in unique(argNames[dupArgNames])) {
        dupArgValues = input[which(argNames %in% dupi)]
        if (all(dupArgValues == dupArgValues[[1]])) {
          # duplicated arguments, but no confusion about what the value should be
          warning(paste0("\nArgument ", dupi, " has been provided more than ",
                         "once in the same call, which is ambiguous. Please fix."))
        } else {
          # duplicated arguments, and confusion about what the value should be
          stop(paste0("\nArgument ", dupi, " has been provided more than ",
                      "once in the same call, which is ambiguous. Please fix."))
        }
      }
    }
  }
  # rest of code...
}



Could you give an example where this is needed?  If a named argument
is duplicated, R will catch that and give an error message:

    > f(a=1, b=2, a=3)
    Error in f(a = 1, b = 2, a = 3) :
  formal argument "a" matched by multiple actual arguments

So this can only happen when it is an argument in the ... list that is
duplicated.  But usually those are passed to some other function, so
something like

    g <- function(...) f(...)

would also catch the duplication in g(a=1, b=2, a=3):

    > g(a=1, b=2, a=3)
    Error in f(...) :
  formal argument "a" matched by multiple actual arguments

The only case where I can see this getting by is where you are never
using those arguments to match any formal argument, e.g.

    list(a=1, b=2, a=3)

Maybe this should have been made illegal when R was created, but I
think it's too late to outlaw now:  I'm sure there are lots of people
making use of this.

Or am I missing something?

Duncan Murdoch



[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN no longer checking for solaris?

2021-12-06 Thread J C Nash

I'd second Uwe's point. I was one of 31 signatories to the first IEEE 754 standard
(I didn't participate in the two more recent releases, as I had already torn my hair
out over the details of low-level bit manipulations). Before the standard, porting
code was truly a nightmare. We did it because we had to, and were too ignorant to
realize what a fool's task it was. Keep code portable, please.

JN

On 2021-12-06 6:07 a.m., Uwe Ligges wrote:



On 06.12.2021 03:09, Avraham Adler wrote:

Would this mean we could start using little endian bit strings, as I think
only the Solaris platform was big endian (or was it the other way around)?


It depends on the hardware, not the OS.
CRAN checked on Intel CPUs, which are little endian, while Solaris was formerly
typically run on SPARC, which is big endian.

In any case, please continue to write cross-platform code. ARM and x86-64 may agree, but we do not know what comes
next, and old ideas may be revived more quickly than expected:
not too many people expected 20 years ago that the future of scientific computing in 2021 would still/again happen on
platforms without support for long doubles / extended precision.
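For code that does care about byte order, R can report it directly at run time; a small sketch:

```r
# Byte order of the platform this R build runs on: "little" or "big".
endianness <- .Platform$endian
cat("This build is", endianness, "endian\n")
```

readBin() and writeBin() also take an `endian` argument, so binary I/O can be written portably regardless of the host's native order.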


Best,
Uwe Ligges




Avi

On Sun, Dec 5, 2021 at 8:56 PM Dirk Eddelbuettel  wrote:



On 5 December 2021 at 17:23, Travers Ching wrote:
| I see that there doesn't exist a Solaris flavor on any CRAN check page.
| However, I'm certain that Solaris was being checked up until very
recently.
|
| Is this just temporary?
|
| Is there any information for the future of Solaris on CRAN?

No "official" word yet on this list, r-devel or elsewhere, or via commits
to
the CRAN Policy (which a cron job of mine monitors).

But Henrik was eagle-eyed and spotted a number of changes to the svn (or
git
mirror thereof) writing Solaris out of the official documentation:

   https://twitter.com/henrikbengtsson/status/1466877096471379970

So yes it seems like an era is coming to a close.

Dirk

--
https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Note on submission Found the following files/directories: 'NUL'

2022-11-25 Thread J C Nash

FWIW: optimx::optimx is outdated and only there for legacy use.
Better to use the optimx::optimr() function for single solvers.

JN


On 2022-11-25 05:10, Ivan Krylov wrote:

On Fri, 25 Nov 2022 09:59:10 +
"ROTOLO, Federico /FR"  wrote:


When submitting my package parfm, I get the following note
Flavor: r-devel-linux-x86_64-debian-gcc
Check: for non-standard things in the check directory, Result: NOTE
   Found the following files/directories:
 'NUL'
so that my submission is rejected.

I cannot find any file or directory called NUL in my package.
Do you have any suggestion?


The file gets created during the check when you call sink('NUL'):
https://github.com/cran/parfm/blob/8c3f45291514aedde67cecf0b090ddd3487f3ada/R/parfm.R#L260-L299

It mostly works on Windows, where "nul" with any extension in any
directory is the null file, but it creates a file named 'NUL' on other
operating systems. It also breaks the non-default sink, if any was set
up by the user.

Does optimx::optimx produce output that can't be turned off otherwise?
(Does it help to set control$trace = 0?) Have you tried
suppressMessages() or capture.output() with nullfile()?
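A portable version of the pattern suggested here; a sketch in which chatty() stands in for the noisy optimx call (nullfile() requires R >= 3.6.0):

```r
# Discard console output portably instead of sink('NUL');
# nullfile() resolves to /dev/null or NUL as appropriate (R >= 3.6.0).
chatty <- function() {
  cat("iteration trace that we do not want to see...\n")
  42  # the value we actually care about
}
result <- NULL
capture.output(result <- chatty(), file = nullfile())
print(result)  # prints [1] 42; the trace went to the null device
```

Unlike sink('NUL'), this creates no stray file on Unix-alikes and does not disturb any sink the user has already set up.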



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Discovering M1mac cowpads

2023-03-24 Thread J C Nash

Recently I updated my package nlsr and it passed all the usual checks and was
uploaded to CRAN. A few days later I got a message that I should "fix" my
package as it had failed in "M1max" tests.

The "error" was actually a failure in a DIFFERENT package that was used as
an example in a vignette. I fixed it in my vignette with try(). However, I
am interested in just where the M1 causes trouble.

As far as I can determine so far, for numerical computations, differences will
show up only when a package is able to take advantage of extended precision
registers in the IEEE arithmetic. I think this means that in pure R, it won't
be seen. Packages that call C or Fortran could do so. However, I've not yet
got a good handle on this.

Does anyone have some small, reproducible examples? (For me, reproducing so
far means making a small package and submitting to macbuilder, as I don't
have an M1 Mac.)
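One pure-R probe may help narrow things down; a sketch that checks whether the build carries extended-precision long doubles, which is the property suspected to matter here:

```r
# Check whether this R build uses extended-precision long doubles.
has_ld <- unname(capabilities("long.double"))
if (has_ld) {
  # mantissa bits of the long-double type, e.g. 64 on x86 hardware
  cat("long double digits:", .Machine$longdouble.digits, "\n")
} else {
  cat("no extended precision: expect small numeric differences vs x86\n")
}
```

On Apple silicon builds this capability is FALSE, so accumulations done in C or Fortran long doubles lose the extra guard digits available on x86.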

Cheers,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] files katex.js and katex-config.js not found in R CMD check --as-cran

2023-05-09 Thread J C Nash

In updating my nlsr package, I ran R CMD check --as-cran and got an error
that /usr/lib/R/doc/html/katex/katex.js was not found.

I installed the (large!) r-cran-katex. No joy.

katex.js was in /usr/share/R/doc/html/katex/  so I created a symlink. Then
I got katex-config.js not found (but it was one directory up). So

sudo ln -s /usr/share/R/doc/html/katex-config.js katex-config.js

Then I get the check to run OK. So I'm now up and running, but others might not
be so persistent.

Is there a glitch in my system? Or is this a bug in the latest R CMD check 
--as-cran?
(Or at least a configuration issue?)

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash



In work on an upgrade to my optimx package, I added to my (plain text) NEWS
file.

The lines

VERSION 2023-06-25

  o This is a MAJOR revision and overhaul of the optimx package and its 
components.
  o Fixed CITATION file based on R CMD check --as-cran complaints
regarding requirement for person() and bibentry() changes.

pass R CMD check --as-cran

but

VERSION 2023-06-25

This is a MAJOR revision and overhaul of the optimx package and its 
components.
  o Fixed CITATION file based on R CMD check --as-cran complaints
regarding requirement for person() and bibentry() changes.

give a NOTE that news() cannot process the chunk/lines in NEWS.

Plain R CMD check passes; i.e., the --as-cran CRAN checks are tripping the NOTE.

I don't see anything about this in Writing R Extensions at moment.

Does anyone have information on what may have changed? I'd like to avoid NOTEs 
if possible,
and since I'm using a plain-text NEWS, don't believe this should trigger one.

The version that passes was the result of some almost random tries to see what 
would
trigger a note.

Cheers,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash

Thanks for the info, though it seems odd that CRAN wants to
parse a plain text file that is purely for information, since it
should have no impact on the current package or any other. I suppose
there might be character set issues to check. The motive for parsing
it eludes me.

Does anyone know if there are plans to use NEWS for some purpose in
the future, i.e., to actually track changes beyond package maintainers'
comments?

Cheers, and thanks again.

JN


On 2023-07-26 10:03, Ivan Krylov wrote:

On Wed, 26 Jul 2023 09:37:38 -0400
J C Nash  wrote:


I'd like to avoid NOTEs if possible, and since I'm using a plain-text
NEWS, don't believe this should trigger one.


Plain-text NEWS files are parsed according to the rules specified in
help(news), which is admittedly laconic in its description. If you run
tools:::.news_reader_default('https://cran.r-project.org/web/packages/optimx/NEWS')
(or news(package = 'optimx')), you can see that R's news() already
misunderstands some of the contents of your NEWS file.

A relatively recent change (r82543, July 2022) set
_R_CHECK_NEWS_IN_PLAIN_TEXT_=TRUE for R CMD check --as-cran and started
verifying that R's plain text "news reader" function could actually
parse plain-text NEWS files without warnings or errors.

I think that if you rename NEWS to ChangeLog, R will leave the file
alone, but CRAN will offer it to users as plain text.



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash

The important information is in the body of the man page for news(),
i.e., found by
   ?utils::news

and this explains why putting an "o" in front of a line clears the
NOTE. Once I realized that CRAN is running this, I could see the
"why". Thanks.

JN

On 2023-07-26 10:25, Duncan Murdoch wrote:


NEWS has been used for a long time by the utils::news() function, which in turn 
is used by the HTML help system.

Duncan Murdoch



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] CRAN test complaints about package that passes most platforms

2023-08-11 Thread J C Nash

My nlsr package was revised mid-February. After CRAN approved it, I got a
message that it was "failing" M1Mac tests. The issue turned out to be ANOTHER
package that was being used in an example in a vignette. Because the M1 does not
provide IEEE 754 80-bit extended-precision registers, a method in package minqa did not
"converge", that is, it did not pass a termination test. Relaxing a tolerance
got a "pass" on the test service for M1 Mac then available. This issue can
be found by searching the web, though it probably deserves some clarity in
R documentation somewhere. The presentation of such problems can, of course,
take many forms.

There was a minor revision to nlsr in May to rationalize the names of some 
functions
to produce summary information about solutions. This seemed to give no issues 
until
now.

Two days ago, however, I received a message that the (unchanged!) package is failing
tests on M1 and on Fedora clang r-devel in building some vignettes. The messages
are about pandoc and a missing file "framed.sty". All other tests showing on
CRAN are OK. When I try with R-hub I seem to get even more complaints than
the messages from CRAN, but about the same issues, and about vignette
building.

2 queries:

- Is anyone else getting similar messages? If so, it may be useful to share
notes to try to get this resolved. It seems within reason that the issue is
some unfortunate detail in Fedora and M1 that interacts with particular
syntax in the vignette, or that the setup of those machines is inadequate.
Comparing notes may reveal what is causing complaints and help to fix either
in the .Rmd vignettes or in the pandoc structure.

- Is there an M1Mac test platform to which packages can be submitted? Brian
Ripley did have one, but trying the link I used before seems not to present
a submission dialog.

I'd like to be helpful, but have a suspicion that a humble package developer
is being used as a stooge to find and fix software glitches outside of R. 
However,
if it's a matter of an unfortunate mismatch of document and processor, I'll be
happy to help document and fix it.

It would be a pity if vignettes cause enough trouble that developers simply 
don't
include them.

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] URL syntax causes R CMD build failure - a fix

2023-09-02 Thread J C Nash

I'm posting this in case it helps some other developers getting build failure.

Recently package nlsr that I maintain got a message that it failed to build on
some platforms. The exact source of the problem is still to be illuminated,
but seems to be in rmarkdown::render and/or pandoc, or an unfortunate interaction between them.
An update to pandoc triggered a failure to process a vignette that had been
happily processed for several years. The error messages are unhelpful, at least
to me,

   Error at "nlsr-devdoc.knit.md" (line 5419, column 1):
   unexpected end of input
   Error: pandoc document conversion failed with error 64
   Execution halted

Unfortunately, adding "keep_md: TRUE" (you need upper-case TRUE to save it when
there is no error of this type) did not save the intermediate file in this
case. However, searching for "pandoc error 64" presented one web page where the 
author
used brute force search of his document by removing / replacing sections to find
the line(s) that caused trouble. This is a little tedious, but effective. In my
case, the offending line turned out to be a copied and pasted URL

https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm

The coded characters can be replaced by a hyphen, to give,

https://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm

and this, when pasted in Mozilla Firefox at least, will go to the appropriate
wikipedia page.
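A quick way to see what those percent escapes encode, using base R only (hedged: the claim that the plain-hyphen URL still works relies on Wikipedia's redirect):

```r
# The escapes %E2%80%93 are the percent-encoded UTF-8 en dash (U+2013).
u <- "https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm"
decoded <- utils::URLdecode(u)
cat(decoded, "\n")  # ...Levenberg-Marquardt_algorithm, but with a real en dash
```

So the "coded characters" are not arbitrary noise: the copied URL contains an en dash, and replacing it with a plain ASCII hyphen is what sidesteps the toolchain problem.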

I'd be interested in hearing from others who have had similar difficulties. I
suspect this is relatively rare, and causing some sort of infelicity in the
output of rmarkdown::render that then trips up some versions of pandoc, which may,
for instance, be now applying stricter rules to URL syntax.

Best,

John Nash



Re: [R-pkg-devel] URL syntax causes R CMD build failure - a fix

2023-09-03 Thread J C Nash

Thanks Uwe. I think you may have found the reason, especially if the URL is output
as LaTeX-formatted text to the intermediate files.

> Where is it in your package and what is the R CMD check output?

The issue was a failure to build the nlsr-devdoc.Rmd vignette. Unfortunately, the
messages were as below (indented). The package passed in May and earlier, and the
failure occurred when pandoc was updated recently on some platforms, so some change
in the toolchain triggered this. While pandoc is a wonderful tool, its message
output can be very unhelpful. I've had difficulties outside of R converting LaTeX
to epub for some of my historical novels. In those cases I've also seen apparent
success with "bits left out", which does upset readers when the story has gaps.
I've never resolved the "why?", but have managed to work around it, sometimes by
simply adding a vertical space (i.e., a line ending), or otherwise rearranging
text. It would be nice to know the reason, but as with the present issue, the
reward is not worth the effort.

I've also seen awkwardness with currency symbols, though that may be my own lack
of detailed knowledge of LaTeX. I need multiple currencies in some stories, and
end up editing with Sigil. I anticipate that some R users with vignettes that
include several currencies might want to check their output.

Whether or not the URL syntax that caused the present trouble is valid, the
percent signs are likely worth avoiding where possible.

Thanks,

JN

On 2023-09-03 10:29, Uwe Ligges wrote:

John can you point us to an example?
Where is it in your package and what is the R CMD check output?

Guess: Within an Rd file you have to escape the %  characters otherwise they 
start a comment.

Best,
Uwe Ligges



On 03.09.2023 00:30, Spencer Graves wrote:
I've encountered similar issues. However, it has been long enough ago that I don't remember enough details to say more 
without trying to update my CRAN packages to see what messages I get and maybe researching my notes from previous 
problems of this nature. Spencer Graves



On 9/2/23 4:23 PM, Greg Hunt wrote:

The percent-encoded characters appear to be valid in that URL, suggesting
that rejecting them is an error. That kind of error could occur when the
software processing them converts them back to a non-Unicode character set.






[R-pkg-devel] Issue of itemize in man file

2023-10-22 Thread J C Nash

I'm doing a major update of the optimx package, and things were going relatively
smoothly until this weekend, when files that had passed win-builder gave NOTEs
on r-devel for several manual (.Rd) files.
The NOTE is of the form


* checking Rd files ... NOTE
checkRd: (-1) fnchk.Rd:40-41: Lost braces in \itemize; \value handles \item{}{} directly
checkRd: (-1) fnchk.Rd:43: Lost braces in \itemize; \value handles \item{}{} directly
checkRd: (-1) fnchk.Rd:45: Lost braces in \itemize; \value handles \item{}{} directly



The source of this looks like

  \item{msg}{A text string giving information about the result of the function check:
Messages and the corresponding values of \code{excode} are:
  \itemize{
\item{fnchk OK;}{ \code{excode} = 0;
   \code{infeasible} = FALSE}
\item{Function returns INADMISSIBLE;}{ \code{excode} = -1; \code{infeasible} = TRUE}
 ...
}

I've not seen this before, nor does a search give any hits.

Does anyone have any ideas? Or is this a glitch in r-devel as things are tried out?

I don't get the NOTE on win-builder R-release, nor in a local R CMD check. Note
that the \itemize is there to give a second-level list, i.e., to expand the output
of one of the \value objects returned, namely the return codes.

Best,

John Nash



Re: [R-pkg-devel] Issue of itemize in man file

2023-10-22 Thread J C Nash

Thanks. That seems to be the issue. Also Vincent's suggestion of checkRd.

JN

On 2023-10-22 10:52, Ivan Krylov wrote:

On Sun, 22 Oct 2023 10:43:08 -0400
J C Nash  wrote:


\itemize{
  \item{fnchk OK;}{ \code{excode} = 0;
 \code{infeasible} = FALSE}


The \item command inside \itemize{} lists doesn't take arguments.
Did you mean \describe{} instead of \itemize{}?
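For reference, the fragment above rewritten with \describe{}, whose \item does take the two-argument form, might look something like this (a sketch, not the exact optimx source):

```
\item{msg}{A text string giving information about the result of the function check.
  Messages and the corresponding values of \code{excode} are:
  \describe{
    \item{fnchk OK;}{\code{excode} = 0; \code{infeasible} = FALSE}
    \item{Function returns INADMISSIBLE;}{\code{excode} = -1; \code{infeasible} = TRUE}
  }
}
```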





[R-pkg-devel] Rmarkdown fails if (quote) r (space) is used

2023-11-03 Thread J C Nash

I've spent a couple of hours with an Rmarkdown document where I
was describing some spherical coordinates made up of a radius r and
some angles. I wanted to fix the radius at 1.

In my Rmarkdown text I wrote

   Thus we have `r = 1` ...

This caused a failure to render, with "unexpected =". I was using RStudio
at first and didn't see the error message.

If I use "radius R" and `R = 1`, things are fine, as is `r=1` with no space,
but the particular "(backtick) r (space)" sequence seems to trigger inline code processing.

Perhaps this note can save others some wasted time.

I had thought (obviously incorrectly) that one needed ```{r something}
to start a code chunk.

JN
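To summarize the behaviour for anyone searching later, here are the variants side by side (as I understand the inline-code rule; the annotations are mine):

```markdown
Thus we have `r = 1` ...    <- backtick, r, space: parsed as inline R code; fails on "="
Thus we have ` r = 1` ...   <- leading space: plain verbatim text, renders fine
Thus we have `R = 1` ...    <- capital R: plain verbatim text, renders fine
Thus we have `r 1 + 1` ...  <- valid inline R code; the expression's value is inserted
```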



Re: [R-pkg-devel] Rmarkdown fails if (quote) r (space) is used

2023-11-03 Thread J C Nash

Yes. An initial space does the trick. Thanks. J

On 2023-11-03 11:48, Serguei Sokol wrote:

Le 03/11/2023 à 15:54, J C Nash a écrit :

I've spent a couple of hours with an Rmarkdown document where I
was describing some spherical coordinates made up of a radius r and
some angles. I wanted to fix the radius at 1.

In my Rmarkdown text I wrote

    Thus we have `r = 1` ...

To avoid confusion between inline code and fixed-font typesetting, could it be

    Thus we have ` r = 1` ...

(with a space after the opening backtick)?

Best,
Serguei.









Re: [R-pkg-devel] How to get arbitrary precise inputs from R for an Rcpp package?

2024-07-18 Thread J C Nash

On the other hand, Rmpfr did allow me to write an mpfr rootfinder for Martin M.
in 2011 (he modified and streamlined it for use), since one can run R code on mpfr
objects as long as one is careful about which operators are applied. So one
probably could do something with a very pedestrian eigenvalue method. One or other
of the codes from my Compact Numerical Methods for Computers might be applicable,
though the first edition's step-and-description codes are likely to be more useful
than the Pascal of the second edition. The one-sided Jacobi approach for eigen or
svd needs only a pretty tiny code. No claims of great efficiency, but in times
when loading from floppies was the order of the day, the short code often got the
lowest start-to-finish elapsed time.
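For concreteness, here is a rough sketch of the one-sided Jacobi idea in plain R. This is my own illustration, not the CNM code, and I have not tried it on mpfr objects; it uses only arithmetic, sqrt, and comparisons, so in principle it could be attempted there:

```r
# One-sided Jacobi: rotate column pairs until all columns are mutually
# orthogonal; the singular values are then the column norms.
onesided_jacobi_sv <- function(A, tol = 1e-12, maxsweep = 30) {
  n <- ncol(A)
  for (sweep in seq_len(maxsweep)) {
    rotated <- FALSE
    for (i in 1:(n - 1)) for (j in (i + 1):n) {
      a <- sum(A[, i]^2); b <- sum(A[, j]^2); c <- sum(A[, i] * A[, j])
      if (abs(c) > tol * sqrt(a * b)) {   # columns i, j not yet orthogonal
        zeta <- (b - a) / (2 * c)
        t  <- if (zeta == 0) 1 else sign(zeta) / (abs(zeta) + sqrt(1 + zeta^2))
        cs <- 1 / sqrt(1 + t^2); sn <- cs * t
        tmp     <- A[, i]
        A[, i]  <- cs * tmp - sn * A[, j]
        A[, j]  <- sn * tmp + cs * A[, j]
        rotated <- TRUE
      }
    }
    if (!rotated) break   # a full sweep with no rotations: converged
  }
  sort(sqrt(colSums(A^2)), decreasing = TRUE)   # singular values
}
```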

Best, JN


On 2024-07-18 19:50, Simon Urbanek wrote:

Khue,



On 19/07/2024, at 11:32 AM, Khue Tran  wrote:

Thank you for the suggestion, Denes, Vladimir, and Dirk. I have indeed
looked into Rmpfr, and while the package can interface GNU MPFR with R
smoothly, as of right now it doesn't have all the functions I need (i.e.
eigen for the mpfr class), and when one inputs decimals, say 0.1, to mpfr(),
the precision is still limited by R's default double precision.




Don't use doubles, use decimal fractions:


Rmpfr::mpfr(gmp::as.bigq(1,10), 512)

1 'mpfr' number of precision  512   bits
[1] 
0.1002

As for eigen() - I'm not aware of an arbitrary-precision solver, so I think the
inputs are your least problem. Most tools out there use LAPACK, which doesn't
support arbitrary precision, so your input precision is likely irrelevant in
this case.

Cheers,
Simon




Thank you for the note, Dirk. I will keep in mind to send any future
questions regarding Rcpp to the Rcpp-devel mailing list. I understand that
the type used in the Boost library for precision is not one of the types
supported by SEXP, so it will be more complicated to map between the C++
code and R. Given that Rmpfr doesn't provide all the necessary mpfr
calculations (and embarking on interfacing Eigen with Rmpfr is not a small
task), does taking input as strings seem like the best option for me to get
precise inputs?

Sincerely,
Khue
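Regarding string input: a finite decimal string can be converted exactly by treating it as an integer numerator over a power-of-ten denominator, avoiding any pass through double precision. A sketch (str_to_mpfr is a hypothetical helper; assumes the gmp and Rmpfr packages):

```r
# Convert a decimal string exactly to an mpfr number via an exact rational
str_to_mpfr <- function(s, prec = 512) {
  parts  <- strsplit(s, ".", fixed = TRUE)[[1]]
  ndec   <- if (length(parts) > 1L) nchar(parts[2]) else 0L   # decimal places
  digits <- gsub(".", "", s, fixed = TRUE)                    # drop the point
  Rmpfr::mpfr(gmp::as.bigq(gmp::as.bigz(digits), gmp::as.bigz(10)^ndec), prec)
}
str_to_mpfr("0.1")   # exact 1/10, rounded once to 512-bit precision
```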

On Fri, Jul 19, 2024 at 8:29 AM Dirk Eddelbuettel  wrote:



Hi Khue,

On 19 July 2024 at 06:29, Khue Tran wrote:
| I am currently trying to get precise inputs by taking strings instead of
| numbers then writing a function to decompose the string into a rational
| with the denominator in the form of 10^(-n) where n is the number of
| decimal places. I am not sure if this is the only way or if there is a
| better method out there that I do not know of, so if you can think of a
| general way to get precise inputs from users, it will be greatly
| appreciated!

That is one possible way. The constraint really is that the .Call() interface
we use for all [1] extensions to R only knows SEXP types, which map to a
small set of known types: double, int, string, bool, ...  The type used by
the Boost library you are using is not among them, so you have to add code to
map back and forth. Rcpp makes that easier; it is still far from automatic.

R has packages such as Rmpfr interfacing GNU MPFR, based on GMP. Maybe that is
good enough?  Also note that Rcpp has a dedicated (low-volume and friendly)
mailing list where questions such as this one may be better suited.

Cheers, Dirk

[1] A slight generalisation. There are others, but they are less common / not
recommended.

--
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org





[R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Hi,

In trying to test that an upgrade to my optimx package does not break other
packages, I wanted to loop over a list of all such packages in alldep, with
nall the length of this list.

cat("Check the dependent packages\n")
for (ii in 1:nall){
  cpkg <- alldep[ii]                    # name of the ii-th dependent package
  dd <- "/home/john/temp/wrkopt/dlpkg"  # where to put downloaded tarballs
  dlname <- download.packages(cpkg, destdir=dd)[[2]]  # path of downloaded tarball
  cat("Downloaded ", dlname,"\n")
  cpkg.chk <- devtools::check_built(dlname)  # run R CMD check on the tarball
  cat("Results package:",cpkg,"\n")
  print(cpkg.chk)
}

Before running this, I did

sink("dpkgcheck.txt", split=TRUE)

and afterwards, I did sink().

But ... none of the check output, nor the result of the final print, show
up in the output file dpkgcheck.txt.

Have I totally misunderstood sink(), or is there a nasty bug?

I've tried running in Rstudio and in the terminal. I'm running Linux Mint
18.3 Sylvia.

Linux john-j6-18 4.10.0-38-generic #42~16.04.1-Ubuntu SMP Tue Oct 10 16:32:20 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
john@john-j6-18 ~ $ R

R version 3.4.4 (2018-03-15) -- "Someone to Lean On"


J C Nash



Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
I got several responses to my query. Henrik's suggests the "why", but I
am rather unhappy that R has this weakness. (See below for a sort of
workaround for Linux users.)

In particular, note that the check_built() function DOES return an object,
but its print() output is NOT captured by sink().

In fact, putting alldep <- "embryogrowth" gives a result file

> Check the dependent packages
> Downloaded  /home/john/temp/wrkopt/dlpkg/embryogrowth_7.4.tar.gz 
> Results package: embryogrowth 
> 

while the bottom of the terminal file gives

> * checking data for non-ASCII characters ... OK
> * checking data for ASCII and uncompressed saves ... OK
> * checking examples ... OK
> * DONE
> 
> Status: OK
> 
> Results package: embryogrowth 
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> 
>> sink()
>> 

Now the object cpkg.chk is still present, so I continued the exercise (terminal
copy here)

> 
>> ls()
> [1] "alldep"   "cpkg"     "cpkg.chk" "dd"       "dlname"   "ii"       "nall"
>> sink("sinktest2.txt", split=TRUE)
>> cpkg.chk
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> print(cpkg.chk)
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> cat("note the above use just the object name as well as print()\n")
> note the above use just the object name as well as print()
>> sink()
>> 

but the file sinktest2.txt is just

> 
> note the above use just the object name as well as print()

Perhaps this isn't a bug, but it rather smells like one, especially the
failure to show cpkg.chk.

Workaround for Linux: Run things via

R |& tee -a myteeoutput.txt

This will keep all the output (sink not needed). But it isn't quite as nice
for keeping the data.
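A related route that does work from within R is to collect a child process's stdout explicitly rather than relying on sink(). A sketch (Unix-alike; assumes echo is available):

```r
# system(..., intern = TRUE) returns the child's stdout as a character vector,
# which sink() cannot intercept but which we can write wherever we like
out <- system("echo hello", intern = TRUE)
writeLines(out, "child-output.txt")
```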

I've also not managed to find a way to get the information out of the cpkg.chk
object. If someone knows how to do that, it would help.

Best, JN








On 2018-04-11 03:24 PM, Henrik Bengtsson wrote:
> R CMD check, which is used internally runs checks in standalone
> background R processes.  Output from these is not capturable/sinkable
> by the master R process.  The gist of what's happening is:
> 
>> sink("output.log")
>> system("echo hello")  ## not sinked/captured
> hello
>> sink()
>> readLines("output.log")
> character(0)
> 
> /Henrik



Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Another workaround is to use

tlogl <- readLines(attr(cpkg.chk, "path"))

Possibly this may suggest a way to improve functionality.

JN
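In the checking loop from my original post, that attribute could be used to accumulate a combined log. A sketch (assumes cpkg.chk and cpkg as in the loop; the file name is illustrative):

```r
# Append the full check log for the current package to a running log file
logpath <- attr(cpkg.chk, "path")    # file holding the R CMD check output
cat("Results package:", cpkg, "\n", file = "dpkgcheck.txt", append = TRUE)
cat(readLines(logpath), sep = "\n", file = "dpkgcheck.txt", append = TRUE)
```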




Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Indeed these are useful for one of my present tasks. Thanks. JN

On 2018-04-11 03:10 PM, Georgi Boshnakov wrote:
> 
> Hi, 
> 
> Not really an answer but I only recently discovered  devtools::revdep(), 
> which automates checking reverse dependencies. 
> 
> Georgi Boshnakov



Re: [R-pkg-devel] CRAN pretest archived because of 2 NOTEs

2018-04-18 Thread J C Nash
If NOTEs are going to be treated as errors, then a lot of infrastructure (all my
packages for optimization and nonlinear modelling, which are dependencies of
a few dozen other packages, etc.) will disappear. This is because they use a
version numbering scheme, -M(M).D(D), that I have used in some form since before
R existed; e.g., NOTE "Version contains large components (2018-3.28)".

I believe changing it to a "smaller" value will mean the submission is refused
with an ERROR, since the numbering will be out of order.

So perhaps it is time to revisit NOTEs: to drop some unnecessary ones, and to
make some careful decisions about changing critical ones to WARNINGs or ERRORs.

One of my major concerns is that it is desirable for CRAN to be the true
repository of R packages, and increased restrictions -- especially unnecessary
ones -- will surely accelerate the movement to informal distribution on other
platforms like GitHub. Such fragmentation of the package universe weakens R as
a resource, and we have seen quite a lot of it recently.

I'm strongly in favour of having fairly strict standards, but also of ensuring
that only necessary restrictions are enforced. Even more, I believe we must
keep working to make satisfying the standards as easy as possible. R has done
a good job of this, but there is always room to improve.

JN
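As an aside on this particular NOTE: compiled code in a package can draw random numbers from R's own generator rather than the C library's rand()/srand(). A minimal sketch (the function name is illustrative):

```c
/* Use R's RNG from package C code instead of the system rand() */
#include <R.h>
#include <R_ext/Random.h>

double next_uniform(void)
{
    double u;
    GetRNGstate();      /* read R's RNG state; set.seed() in R controls it */
    u = unif_rand();    /* uniform deviate in (0, 1) */
    PutRNGstate();      /* write the state back so R stays in sync */
    return u;
}
```

This keeps results reproducible via set.seed(), though it would not reproduce the stream of the C library's rand(), which was the cross-language compatibility goal in the case quoted below.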




On 2018-04-18 01:40 PM, Hadley Wickham wrote:
> For the purposes of CRAN submission, you should basically treat every
> NOTE as an ERROR.
> 
> Hadley
> 
> On Wed, Apr 18, 2018 at 3:36 AM, Gertjan van den Burg
>  wrote:
>> While waiting to get this message posted to the list, I've solved the
>> problem by copying the stdlib rand() and srand() functions into my package
>> under a different name. This makes the check pass and ensures my RNG does
>> not interfere with R's RNG.
>>
>> I do think that if this NOTE causes immediate dismissal of a package, it
>> shouldn't be a NOTE but an ERROR. Otherwise it just leads to a lot of wasted
>> time waiting for the maintainers to respond to the note.
>>
>>> Dear all,
>>>
>>> My CRAN submission doesn't pass the pre-tests and gets archived. I've
>>> emailed cran-submissi...@r-project.org explaining that these are false
>>> positives, but since I haven't heard back in 10 days I don't think anyone
>>> read that. Same thing for the submission comments (which also explained
>>> it).
>>>
>>> The first note is:
>>>
>>> * checking CRAN incoming feasibility ... NOTE
>>> Maintainer: ‘Gertjan van den Burg ’
>>>
>>> New submission
>>>
>>> Possibly mis-spelled words in DESCRIPTION:
>>>GenSVM (8:18, 10:61, 15:2, 16:26, 19:11)
>>>Multiclass (4:22)
>>>SVMs (14:25, 15:42)
>>>misclassifications (11:49)
>>>multiclass (8:53, 14:14, 15:31)
>>>
>>>
>>> These words are not mis-spelled, so this is a false positive.
>>>
>>> The second note is:
>>>
>>> * checking compiled code ... NOTE
>>> File ‘gensvm/libs/gensvm_wrapper.so’:
>>>Found ‘rand’, possibly from ‘rand’ (C)
>>>  Objects: ‘gensvm/src/gensvm_cv_util.o’, ‘gensvm/src/gensvm_init.o’,
>>>‘gensvm/lib/libgensvm.a’
>>>Found ‘srand’, possibly from ‘srand’ (C)
>>>  Objects: ‘gensvm/src/gensvm_train.o’, ‘gensvm/lib/libgensvm.a’
>>>
>>> Compiled code should not call entry points which might terminate R nor
>>> write to stdout/stderr instead of to the console, nor use Fortran I/O
>>> nor system RNGs.
>>>
>>> See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
>>>
>>>
>>> This is probably why the package is rejected. I have a valid use case for
>>> using rand() and srand(): I'm trying to maintain compatibility of this
>>> package with the corresponding Python package. By using rand and srand,
>>> users can reproduce models in both languages.
>>>
>>> Does anyone have any ideas on how I can get the package accepted to CRAN?
>>>
>>> Thanks,
>>>
>>> Gertjan van den Burg
>>>
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> 
>



[R-pkg-devel] SIAM Wilkinson prize

2018-05-18 Thread J C Nash
It occurs to me that there could be packages developed by early-career R
developers that might fit this prize, which is considered quite prestigious (not
to mention the cash) in the numerical methods community. It is also likely that
people in the R community are not aware of the award.

Cheers, JN



 Forwarded Message 
Subject:[SIAM-OPT] June 1 Entry Deadline - James H. Wilkinson Prize for Numerical Software
Date:   Thu, 17 May 2018 14:22:41 +
From:   SIAM Prize Program 
CC: Optimization SIAG mailing list 



James H. Wilkinson Prize for Numerical Software



The deadline is June 1 for entries for the James H. Wilkinson Prize for Numerical
Software. We are looking for submissions of high-quality numerical software from
early career teams. If you or your team are developing numerical software for
scientific computing, act as a nominator and enter your software for the prize.
To submit an entry, you first need to create an account at the SIAM Prize Portal.

The James H. Wilkinson Prize for Numerical Software is awarded every four years
to the authors of an outstanding piece of numerical software. The prize is
awarded for an entry that best addresses all phases of the preparation of
high-quality numerical software. It is intended to recognize innovative software
in scientific computing and to encourage researchers in the earlier stages of
their career.

SIAM will award the Wilkinson Prize for Numerical Software at the SIAM Conference
on Computational Science and Engineering (CSE19). The award will consist of
$3,000 and a plaque. As part of the award, the recipient(s) will be expected to
present a lecture at the conference.





*Eligibility Criteria:*

Selection will be based on: clarity of the software implementation and
documentation; importance of the application(s) addressed by the software;
portability, reliability, efficiency, and usability of the software
implementation; clarity and depth of analysis of the algorithms and the software
in the accompanying paper; and quality of the test software.

Candidates must have worked in mathematics or science for at most 12 years
(full-time equivalent) after receiving their PhD as of January 1 of the award
year, allowing for breaks in continuity. The prize committee can make exceptions
if, in their opinion, the candidate is at an equivalent stage in their career.

For the 2019 award, a candidate must have received their PhD no earlier than
January 1, 2007.



*Entry Deadline:*

*June 1, 2018*



*Required Materials:*

· CVs of the authors of the software, at most two pages per author (PDF)

· A two-page summary of the main features of the algorithm and software
implementation (PDF)

· A paper describing the algorithm and the software implementation (PDF)

· Open source software written in a widely available high-level programming
language. The software should be submitted in a gzipped .tar archive with a
README file describing the contents of the archive. Each submission should
include documentation, examples of the use of the software, a test program, and
scripts for executing the test programs.





*Previous recipients:*



Previous recipients of the James H. Wilkinson Prize for Numerical Software are:



*2015* Patrick Farrell, Simon Funke, David Ham, and Marie Rognes for dolfin-adjoint
*2011* Andreas Waechter and Carl Laird for IPOPT
*2007* Wolfgang Bangerth, Guido Kanschat, and Ralf Hartmann for deal.II
*2003* Jonathan Shewchuk for Triangle
*1999* Matteo Frigo and Steven Johnson for FFTW
*1995* Chris Bischof and Alan Carle for ADIFOR 2.0
*1991* Linda Petzold for DASSL





*Selection Committee:*

Jorge Moré (Chair), Argonne National Laboratory
Sven Hammarling, Numerical Algorithms Group Ltd and University of Manchester
Michael Heroux, Sandia National Laboratories
Randall J. LeVeque, University of Washington
Katherine Yelick, Lawrence Berkeley National Laboratory



Learn more about our prize program and view all prizes on the SIAM website.

[R-pkg-devel] Concern that reverse dependency checking currently unreliable

2018-06-20 Thread J C Nash
For the past few weeks I've been struggling to check a new version of optimx that
provides a major upgrade over the 2013 version currently on CRAN and subsumes
several other packages.

It seems to work fine, and passes win-builder, R CMD check, etc.

However, both the devtools and the CRAN-style reverse dependency checks give
multiple errors when run through their respective special commands. Mostly, the
packages using optimx fail to install or otherwise stop (wrong testthat version,
etc.). When I check manually (if the package installs) whether package XXX passes
R CMD check with the new optimx installed, it invariably does. The automated
checks -- given that optimx is widely used -- take several hours per try.

Since it "fails" the CRAN reverse check, I'm not getting past the pre-test for
CRAN submission. The CRAN maintainers have not responded, likely because they are
swamped.

So I'm wondering whether something is amiss with the checking tools or, more
likely, whether the upgrade to R 3.5 has put some wrinkles in the package
collection, and, in particular, what I should do to provide the new package. I am
rather reluctant to join the growing crowd on GitHub, but may have to if CRAN
cannot be updated.

Ideas welcome.

John Nash



[R-pkg-devel] Reverse dependencies - again

2018-07-11 Thread J C Nash
Apologies for the length of this, but a lot of packages use optimx.

Over the past couple of months, I have been struggling with checking
a new, and much augmented, optimx package. It offers some serious
improvements:
  - parameter scaling on all methods
  - two safeguarded Newton methods
  - improved gradient approximations
  - more maintainable structure for adding new solvers in future
  - a single method "solver" (wrapper) that uses optim() syntax

However, I have been going through a lot of bother with reverse
dependency checking. In summary, tools::check_packages_in_dir and
devtools::revdep_check both give lots of complaints, in particular
about packages that cannot be installed. When I run my own checks
(see below), I get no issues that reflect on my own package. Similarly,
Duncan Murdoch was very helpful and ran a check on an earlier version
with no major issues using his own check program. I have noticed,
or been told, that several other workers have their own procedures
and that they have experienced somewhat similar difficulties.

Unfortunately, my last effort to submit to CRAN got blocked at the
pre-scan stage because of reverse dependencies. I have not been able
to fix what I cannot find, however, as it appears I am getting caught
on an obstacle outside my control. I did not get a response from the CRAN
team, but the timing was in the middle of the R 3.5 launch, so that is
understandable. However, I am very reluctant to use submission to
CRAN as a checking tool.

My questions:

1) Am I missing something in my method of calling tools or devtools?
The code is included below. Or am I not reading the output carefully?
I will be happy to put details on a website -- too large for here.

2) Is it time to consider an effort to provide online revdep checking
that would avoid pressure on the CRAN team directly and would provide
clearer indicators of the issues raised by a particular package? I'd
be happy to assist in such an effort, as it appears to be needed, and
with appropriate output and links it could help developers improve
their packages.

Cheers,

John Nash

tools::check_packages_in_dir finds install fail for

BioGeoBEARS
CensSpatial
Countr
ecd
ldhmm
LifeHist
macc
marked
midasr
QuantumClone
spfrontier
surrosurv

Code:

# cranrevdep

require(tools)
pkgdir <- "~/temp/wrkopt/srcpkg"
jcheck<-check_packages_in_dir(pkgdir,
  check_args = c("--as-cran", ""),
  reverse = list(repos = getOption("repos")["CRAN"]))
summary(jcheck)

-

devtools::revdep_check finds install fail for:

afex
IRTpp
lme4

as well as

BioGeoBEARS
CensSpatial
Countr
ecd
ldhmm
LifeHist
macc
marked
midasr
QuantumClone
spfrontier
surrosurv

Summary:
Saving check results to `revdep/check.rds` 
---
Cleaning up 
--
* Failed to install: afex, BioGeoBEARS, CensSpatial, Countr, ecd, IRTpp, ldhmm, 
LifeHist, lme4, macc, marked, midasr,
QuantumClone, spfrontier, surrosurv
* ACDm: checking compilation flags used ... WARNING
* languageR: checking Rd cross-references ... WARNING
* mvord: checking compilation flags used ... WARNING
* RandomFields: checking compilation flags used ... WARNING
* rankdist: checking compilation flags used ... WARNING
* regsem: checking compilation flags used ... WARNING


21 packages with problems

|package  |version | errors| warnings| notes|
|:|:---|--:|:|-:|
|ACDm |1.0.4   |  0|1| 1|
|afex |0.21-2  |  1|0| 0|
|BioGeoBEARS  |0.2.1   |  1|0| 0|
|CensSpatial  |1.3 |  1|0| 0|
|Countr   |3.4.1   |  1|0| 0|
|ecd  |0.9.1   |  1|0| 0|
|IRTpp|0.2.6.1 |  1|0| 0|
|languageR|1.4.1   |  0|1| 4|
|ldhmm|0.4.5   |  1|0| 0|
|LifeHist |1.0-1   |  1|0| 0|
|lme4 |1.1-17  |  1|0| 0|
|macc |1.0.1   |  1|0| 0|
|marked   |1.2.1   |  1|0| 0|
|midasr   |0.6 |  1|0| 0|
|mvord|0.3.1   |  0|1| 0|
|QuantumClone |1.0.0.6 |  1|0| 0|
|RandomFields |3.1.50  |  0|1| 2|
|rankdist |1.1.3   |  0|1| 0|
|regsem   |1.1.2   |  0|1| 0|
|spfrontier   |0.2.3   |  1|0| 0|
|surrosurv|1.1.24  |  1|0| 0|


But there are 43 dependencies, so must we assume the rest are OK?
> require(devtools)
Loading required package: devtools
> rdlist <- revdep()
> rdlist
 [1] "ACDm"  "afex"  "bbmle"
 [4] "BioGeoBEARS"   "calibrar"  "CatDyn"
 [7] "CensSpatial"   "CJAMP" "Countr"
[10] "dimRed""ecd"   

Re: [R-pkg-devel] can't reproduce cran-pretest error

2018-07-26 Thread J C Nash
I think several of us have had similar issues lately. You might have seen my 
posts on reverse dependencies.
It seems there are some sensitivities in the CRAN test setup, though I think 
things are improving.

Last week I submitted optimx again. I don't think I changed anything but the 
date and some commentary
in documentation files. The pre-test was much better than before, but still had 
two "complaints". One of
these was an "ERROR" of the type "non-convergence -- please choose a different 
minimizer" (in mvord)
and the other was a "WARNING" since my new package (optimx) subsumes the
functionality of several other packages, including optextras, so the functions
from that package were now masked, as might be expected. This was in surrosurv.

However, I did follow protocol and reported these false positives, but have had
no response from the CRAN team, nor any response to previous messages about
2 months ago. I suspect some volunteer overload, but
if that is the case, I would hope the CRAN team would ask for help. I know 
there are several of us
willing to help if we can. And the new functionality does fix some actual bugs, 
as well as providing
improvements. Without renewal, particularly for infrastructure packages, R will 
decay.

Cheers, JN



On 2018-07-26 03:11 PM, Brad Eck wrote:
> Dear list,
> 
> I'm having trouble reproducing errors from CRAN's pretests.
> 
> I have a package on CRAN called epanet2toolkit that provides R bindings
> to a legacy simulation engine written in C.  So far I've released two
> versions
> to CRAN without trouble.  Now I'm making a third release, principally to
> include
> a citation for the package, but also to clean up warnings raised by new
> compilers.
> 
> My latest submission fails the CRAN pretests for Debian with errors in the
> examples and tests:
> https://win-builder.r-project.org/incoming_pretest/
> epanet2toolkit_0.3.0_20180726_102947/Debian/00check.log
> 
> For what it's worth, the package checks fine under R-3.4.4, R-3.5.0 and
> R-devel
> r74997 (2018-07-21) and r74923 (2018-06-20).
> 
> However, when I run the debian-r-devel checks locally (albeit in Docker) I
> get
> a couple of warnings, but no errors. Since I can't reproduce the error, it's
> difficult to fix. See below the relevant lines of 00check.log:
> 
> * using log directory ‘/pkg/epanet2toolkit.Rcheck’
> * using R Under development (unstable) (2018-07-25 r75006)
> * using platform: x86_64-pc-linux-gnu (64-bit)
> * using session charset: UTF-8
> * using option ‘--as-cran’
> * checking for file ‘epanet2toolkit/DESCRIPTION’ ... OK
> * checking extension type ... Package
> ...
> * checking whether package ‘epanet2toolkit’ can be installed ... WARNING
> Found the following significant warnings:
>   text.h:421:9: warning: ‘KwKw  /d’ directive writing
> 30 bytes into a region of size between 23 and 278 [-Wformat-overflow=]
> See ‘/pkg/epanet2toolkit.Rcheck/00install.out’ for details.
> ...
> * checking compilation flags used ... WARNING
> Compilation used the following non-portable flag(s):
>   ‘-Wdate-time’ ‘-Werror=format-security’ ‘-Wformat’
> * checking compiled code ... OK
> * checking examples ... OK
> * checking for unstated dependencies in ‘tests’ ... OK
> * checking tests ... OK
>   Running ‘testthat.r’
> * checking PDF version of manual ... OK
> * DONE
> Status: 2 WARNINGs, 1 NOTE
> 
> 
> Thanks in advance for any insights.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>



Re: [R-pkg-devel] CRAN incoming queue closed from Sep 1 to Sep 9

2018-08-14 Thread J C Nash
Will pending queries to CRAN-submissions about false positives in the check 
process be cleared
first, or left pending? I've been waiting quite a while regarding the new optimx
package, which has one so-called "ERROR" (a "non-convergence, please use a
different method" message) and one WARNING because the new optimx subsumes
optextras.

An alternative I can accept is an invitation to resubmit after the reset, but
those of us in limbo do need to know, to avoid wasting time on all sides.

Best, JN


On 2018-08-14 10:36 AM, Hadley Wickham wrote:
> Does this include automatically (bot) accepted submissions?
> Hadley
> On Tue, Aug 14, 2018 at 8:07 AM Uwe Ligges
>  wrote:
>>
>> Dear package developers,
>>
>> the CRAN incoming queue will be closed from Sep 1 to Sep 9. Hence
>> package submissions are only possible before and after that period.
>>
>> Best,
>> Uwe Ligges
>> (for the CRAN team)
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> 
>



[R-pkg-devel] Trying to work around missing functionality

2018-08-27 Thread J C Nash
In order to track progress of a variety of rootfinding or optimization
routines that don't report some information I want, I'm using the
following setup (this one for rootfinding).

TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
# JN: Define globals here
   groot<-list(ifn=ifn, igr=igr, ftrace=ftrace, fn=fn, gr=gr, label="none")
   envroot <<- list2env(groot) # Note globals in FnTrace
   ## This generates a NOTE that
   ## TraceSetup: no visible binding for '<<-' assignment to ‘envroot’
##   envroot<-list2env(groot, parent=.GlobalEnv) # Note globals in FnTrace -- 
this does NOT work
   ## utils::globalVariables("envroot") # Try declaring here -- causes errors
# end globals
   envroot
}

FnTrace <- function(x,...) {
  # Substitute function to call when rootfinding
  # Evaluate fn(x, ...)
val <- envroot$fn(x, ...)
envroot$ifn <- envroot$ifn + 1 # probably more efficient ways
if (envroot$ftrace) {
   cat("f(",x,")=",val," after ",envroot$ifn," ",envroot$label,"\n")
}
val
}


Perhaps there are better ways to do this, but this does seem to work quite well.
It lets me call a rootfinder with FnTrace and get information on evaluations of 
fn().
(There's another gr() routine, suppressed here.)

However, R CMD check gives a NOTE for

  TraceSetup: no visible binding for global variable ‘envroot’
  Undefined global functions or variables:
envroot

The commented lines in TraceSetup suggest some of the things I've tried. 
Clearly I don't
fully comprehend how R is grinding up the code, but searches on the net seem to 
indicate
I am far from alone. Does anyone have any suggestion of a clean way to avoid 
the NOTE?

JN



Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-27 Thread J C Nash
Unfortunately, that makes things much worse. I'd tried something like this already.

> * checking examples ... ERROR
> Running examples in ‘rootoned-Ex.R’ failed
> The error most likely occurred in:
> 
>> ### Name: rootwrap
>> ### Title: zeroin: Find a single root of a function of one variable within
>> ###   a specified interval.
>> ### Aliases: rootwrap
>> ### Keywords: root-finding
>> 
>> ### ** Examples
>> 
>> # Dekker example
>> # require(rootoned)
>> dek <- function(x){ 1/(x-3) - 6 }
>> r1 <- rootwrap(dek, ri=c(3.001, 6), ftrace=TRUE, method="uniroot")
> Error in registerNames(names, package, ".__global__", add) : 
>   The namespace for package "rootoned" is locked; no changes in the global 
> variables list may be made.
> Calls: rootwrap -> TraceSetup ->  -> registerNames
> Execution halted

Also had to use utils::globalVariables( ...

JN


On 2018-08-27 08:40 PM, Richard M. Heiberger wrote:
> Does this solve the problem?
> 
> if (getRversion() >= '2.15.1')
>   globalVariables(c('envroot'))
> 
> I keep this in file R/globals.R
> 
> I learned of this from John Fox's use in Rcmdr.
> 
> On Mon, Aug 27, 2018 at 8:28 PM, J C Nash  wrote:
>> In order to track progress of a variety of rootfinding or optimization
>> routines that don't report some information I want, I'm using the
>> following setup (this one for rootfinding).
>>
>> TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
>> # JN: Define globals here
>>groot<-list(ifn=ifn, igr=igr, ftrace=ftrace, fn=fn, gr=gr, label="none")
>>envroot <<- list2env(groot) # Note globals in FnTrace
>>## This generates a NOTE that
>>## TraceSetup: no visible binding for '<<-' assignment to ‘envroot’
>> ##   envroot<-list2env(groot, parent=.GlobalEnv) # Note globals in FnTrace 
>> -- this does NOT work
>>## utils::globalVariables("envroot") # Try declaring here -- causes errors
>> # end globals
>>envroot
>> }
>>
>> FnTrace <- function(x,...) {
>>   # Substitute function to call when rootfinding
>>   # Evaluate fn(x, ...)
>> val <- envroot$fn(x, ...)
>> envroot$ifn <- envroot$ifn + 1 # probably more efficient ways
>> if (envroot$ftrace) {
>>cat("f(",x,")=",val," after ",envroot$ifn," ",envroot$label,"\n")
>> }
>> val
>> }
>>
>>
>> Perhaps there are better ways to do this, but this does seem to work quite 
>> well.
>> It lets me call a rootfinder with FnTrace and get information on evaluations 
>> of fn().
>> (There's another gr() routine, suppressed here.)
>>
>> However, R CMD check gives a NOTE for
>>
>>   TraceSetup: no visible binding for global variable ‘envroot’
>>   Undefined global functions or variables:
>> envroot
>>
>> The commented lines in TraceSetup suggest some of the things I've tried. 
>> Clearly I don't
>> fully comprehend how R is grinding up the code, but searches on the net seem 
>> to indicate
>> I am far from alone. Does anyone have any suggestion of a clean way to avoid 
>> the NOTE?
>>
>> JN
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel



Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-28 Thread J C Nash
Indeed, it appears that globalVariables must be called outside the function.
However, I had quite a bit of fiddling to get things to work without errors,
warnings, or notes. While I now
have a package that
does not complain with R CMD check, I am far from satisfied that I can give a 
prescription. I had
to remove lines in the rootfinder like
   envroot$fn <- fn
that were used to set the function to be used inside my instrumented function, 
and instead
call TraceSetup(fn=fn, ...) where a similar statement was given. Why that 
worked while the direct
assignment (note, not a <<- one) did not, I do not understand. However, I will 
work with this for
a while and try to get a better handle on it.

Thanks for the pointer. As an old-time programmer from the days when you even
set the add table, I'm still uncomfortable just putting code in a directory
(i.e., the globals.R file) and assuming it will be executed. However, I now
have this set up to establish the global structure as follows

> ## Put in R directory. 
> if(getRversion() >= "2.15.1") { utils::globalVariables(c('envroot')) } # Try 
> declaring here 
> groot<-list(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA, label="none")
> envroot <- list2env(groot) # Note globals in FnTrace

Then TraceSetup() is

> TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
>envroot$ifn <- ifn
>envroot$igr <- igr
>envroot$ftrace <- ftrace
>envroot$fn <- fn
>envroot$gr <- gr
>return()
> }

and it is called at the start of the rootfinder routine.

Thus I am establishing a global, then (re-)setting values in TraceSetup(), then
incrementing counters etc. in the instrumented FnTrace() that is the function 
for which I find
the root, which calls fn() given by the "user". Messy, but I can now track 
progress and measure
effort.

I'm sure there are cleaner solutions. I suggest offline discussion would be 
better until such
options are clearer.

Thanks again.

JN



On 2018-08-28 12:01 AM, Fox, John wrote:
> Hi John,
> 
> It's possible that I didn’t follow what you did, but it appears as if you 
> call globalVariables() *inside* the function. Instead try to do as Richard 
> Heiberger suggested and place the call outside of the function, e.g., in a 
> source file in the package R directory named globals.R. (Of course, the name 
> of the source file containing the command isn’t important.)
> 
> I hope this helps,
>  John
> 
> -
> John Fox
> Professor Emeritus
> McMaster University
> Hamilton, Ontario, Canada
> Web: https://socialsciences.mcmaster.ca/jfox/
> 
> 
> 
>> -Original Message-
>> From: R-package-devel [mailto:r-package-devel-boun...@r-project.org] On
>> Behalf Of J C Nash
>> Sent: Monday, August 27, 2018 8:44 PM
>> To: Richard M. Heiberger 
>> Cc: List r-package-devel 
>> Subject: Re: [R-pkg-devel] Trying to work around missing functionality
>>
>> Unfortunately, makes things much worse. I'd tried something like this 
>> already.
>>
>>> * checking examples ... ERROR
>>> Running examples in ‘rootoned-Ex.R’ failed The error most likely
>>> occurred in:
>>>
>>>> ### Name: rootwrap
>>>> ### Title: zeroin: Find a single root of a function of one variable within
>>>> ###   a specified interval.
>>>> ### Aliases: rootwrap
>>>> ### Keywords: root-finding
>>>>
>>>> ### ** Examples
>>>>
>>>> # Dekker example
>>>> # require(rootoned)
>>>> dek <- function(x){ 1/(x-3) - 6 }
>>>> r1 <- rootwrap(dek, ri=c(3.001, 6), ftrace=TRUE,
>>>> method="uniroot")
>>> Error in registerNames(names, package, ".__global__", add) :
>>>   The namespace for package "rootoned" is locked; no changes in the global
>> variables list may be made.
>>> Calls: rootwrap -> TraceSetup ->  -> registerNames
>>> Execution halted
>>
>> Also had to use utils::globalVariables( ...
>>
>> JN
>>
>>
>> On 2018-08-27 08:40 PM, Richard M. Heiberger wrote:
>>> Does this solve the problem?
>>>
>>> if (getRversion() >= '2.15.1')
>>>   globalVariables(c('envroot'))
>>>
>>> I keep this in file R/globals.R
>>>
>>> I learned of this from John Fox's use in Rcmdr.
>>>
>>> On Mon, Aug 27, 2018 at 8:28 PM, J C Nash 
>> wrote:
>>>> In order to track progress of a variety of rootfinding or
>>>> optimization routines that don

Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-28 Thread J C Nash
Thanks for this. Also Duncan's description of how code in the R directory is 
executed.

I've more or less figured out a workaround. Unfortunately Georgi's solution 
doesn't quite
do the trick. Here is my current understanding and solution.

Issue: I want to get the root of a function of one parameter x, but there may be
exogenous data Xdata.
   I also want to count the evaluations.
   Some rootfinders don't have a counter and some don't allow "..."

Initial solution: Create a global envroot with counters and such. Put the names
of the function and gradient in there, then use a dummy FnTrace (and possibly
grTrace). This gave various check complaints about globals etc. However, it does
appear to work.

Present approach: Slightly less flexible, but no complaints.
   Within rootwrap(), which calls different rootfinders according to
   method="name", define
   FnTrace and grTrace, set up a list glist for the items I want to share, then
   envroot <- list2env(glist)

The FnTrace and grTrace are defined before the calls to rootfinders, so envroot 
can be found.
No globals. R CMD check is happy. However, I must call rootfinders via the
wrapper, which is actually simpler from the point of view of syntax.
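To make the description above concrete, here is a minimal sketch of the pattern
being described (the names rootwrap, FnTrace, and envroot are from this thread;
the body below is an illustration under those assumptions, not the actual
rootoned/optimx code). The counter environment is local to the wrapper, so
there are no globals and no R CMD check NOTE:

```r
# Sketch: a wrapper that instruments the user's fn() with an evaluation
# counter kept in a local environment that FnTrace closes over.
rootwrap <- function(fn, ri, ftrace = FALSE, method = "uniroot", ...) {
  # items to share between the wrapper and the traced function
  envroot <- list2env(list(ifn = 0, ftrace = ftrace, label = method))
  FnTrace <- function(x, ...) {       # defined here, so it can see envroot
    val <- fn(x, ...)
    envroot$ifn <- envroot$ifn + 1
    if (envroot$ftrace)
      cat("f(", x, ") =", val, "after", envroot$ifn, envroot$label, "\n")
    val
  }
  sol <- switch(method,
                uniroot = uniroot(FnTrace, interval = ri, ...),
                stop("unknown method: ", method))
  sol$ifn <- envroot$ifn              # attach the evaluation count
  sol
}

# Dekker example from earlier in the thread: root of 1/(x-3) - 6 is 3 + 1/6
dek <- function(x) 1/(x - 3) - 6
r1 <- rootwrap(dek, ri = c(3.001, 6))
```

Since FnTrace is created fresh on each call, each rootwrap() invocation gets
its own counter, which is the "no globals" property the post is after.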

I've still some testing and tweaking to do, but I think the main issues are resolved by this.

Thanks to all who responded.

JN







On 2018-08-28 12:27 PM, Georgi Boshnakov wrote:
> If you don't insist on putting the variable in the global environment, 
> variations of the following give a cleaner solution:
> 
> TraceSetup_1 <- local({
> ifn = 0
> igr = 0
> ftrace = FALSE
> fn = NA
> gr = NA
> 
> function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> ifn<<- ifn
> igr<<- igr
> ftrace <<- ftrace
> fn <<- fn
> gr <<- gr
> parent.env(environment())
> }
> })
> 
> For example,
> 
> TraceSetup_1 <- local({
> + ifn = 0
> + igr = 0
> + ftrace = FALSE
> + fn = NA
> + gr = NA
> + function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> + ifn<<- ifn
> + igr<<- igr
> + ftrace <<- ftrace
> + fn <<- fn
> + gr <<- gr
> + parent.env(environment())
> + }
> + })
>>
>> e <- TraceSetup_1(fn = function(x) x^2)
>> ls(e)
> [1] "fn"     "ftrace" "gr"     "ifn"    "igr"
>> e$fn
> function(x) x^2
> 
> ## let's change 'fn':
>> e$fn <- function(x) x^4
>> e$fn
> function(x) x^4
> 
> 
> Note that the environment is always the same, so can be accessed from 
> anywhere in your code:
> 
>> e2 <- environment(TraceSetup_1)
>> e2
> 
>> identical(e2, e)
> [1] TRUE
>>
> 
> If you need a new environment every time, a basic setup might be:
> 
> TraceSetup_2 <- local({
> staticVar1 <- NULL
> ## other variables here
> 
> function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> ## force evaluation of the arguments
> ifn
> igr
> ftrace 
> fn 
> gr 
> environment()
> }
> })
> 
> There is no need for local() here but usually one needs also some static 
> variables.
> Now every call gives a different environment  (but all have the same parent):
> 
> ea <- TraceSetup_2(fn = function(x) x^2 - 2*x + 1)
>> ls(ea)
> [1] "fn"     "ftrace" "gr"     "ifn"    "igr"
>> ea$fn
> function(x) x^2 - 2*x + 1
>>
>> eb <- TraceSetup_2(fn = function(x) x^2 + 1)
>> eb$fn
> function(x) x^2 + 1
>>
>> ## ea$fn is still the same:
>> ea$fn
> function(x) x^2 - 2*x + 1
>>
> 
> Obviously, in this case some further arrangements are  needed for the 
> environments to be made available to the external world.
> 
> Hope this helps,
> Georgi Boshnakov
> 
> 
> -Original Message-
> From: R-package-devel [mailto:r-package-devel-boun...@r-project.org] On 
> Behalf Of J C Nash
> Sent: 28 August 2018 14:18
> To: Fox, John; Richard M. Heiberger
> Cc: List r-package-devel
> Subject: Re: [R-pkg-devel] Trying to work around missing functionality
> 
> Indeed, it appears that globalVariables must be outside the function. 
> However, I had quite a bit of
> fiddle to get things to work without errors or warnings or notes. While I now 
> have a package that
> does not complain with R CMD check, I am far from satisfied that I can give a 
> prescription. I had
> to remove lines in the rootf

[R-pkg-devel] Collaboration request to build R wrapper for C program(s)

2018-12-12 Thread J C Nash
There is a quite well-developed but not terribly large C program
for conjugate gradients and similar approaches to optimization I would
like to wrap in a package for use in R. I would then build this into the
optimx package I maintain. I suspect that the approach may turn out to be
one of the most efficient for large-n problems.

However, my skills with C and C++ are essentially knowing how to mimic
existing code, and I would welcome collaboration or help to build the
package, likely using Rcpp. Possibly this would be a suitable term
project in a stat. computing course. I'm more than happy to share my
expertise on the optimization side, or with other computing languages,
particularly Fortran.

To reduce noise on the list, I'll suggest off-line contact to the
address above (profjcnash _at_ gmail.com).

Best, JN



[R-pkg-devel] try() in R CMD check --as-cran

2019-06-06 Thread J C Nash
After making a small fix to my optimx package, I ran my usual R CMD check 
--as-cran.

To my surprise, I got two ERRORs unrelated to the change. The errors popped up 
in
a routine designed to check the call to the user objective function. In 
particular,
one check is that the size of vectors is the same in expressions like (x - y)^2.
This works fine with R CMD check, but the --as-cran seems to have changed and it
pops an error, even when the call is inside try(). The irony that the routine in
question is intended to avoid problems like this is not lost on me.

I'm working on a small reproducible example, but it's not small enough yet.
In the meantime, I'm looking for the source code of the scripts for "R CMD check"
and "R CMD check --as-cran" so I can work out why there is this difference,
which seems to be recent.

Can someone send/post a link? I plan to figure this out and provide feedback,
as I suspect it is going to affect others. However, it may be a few days or even
weeks if past experience is a guide.

JN
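As the rest of this thread works out, the extra behaviour comes from check
environment variables that --as-cran sets (notably _R_CHECK_LENGTH_1_LOGIC2_).
A hedged way to see the symptom directly, outside R CMD check: in R versions
before 4.3.0 the || expression below only errors when that variable is set,
while from R 4.3.0 it is an error unconditionally.

```r
# A length-2 operand to ||: flagged under --as-cran, and an outright
# error in R >= 4.3.0. tryCatch lets us observe either outcome.
res <- tryCatch(c(TRUE, FALSE) || TRUE,
                error = function(e) conditionMessage(e))
# res is TRUE on old R without the check variable set,
# or an error message string on newer R / under --as-cran settings.
```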



Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Sorry the reply was not quicker. For some reason I'm not getting anything in the
thread I started! I found the responses in the archives. Perhaps cc: nas...@uottawa.ca please.

I have prepared a tiny (2.8K) package at
http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz

R CMD check --> OK

R CMD check --as-cran --> 1 ERROR, 1 NOTE

The error is in an example:

> benbad<-function(x, y){
># y may be provided with different structures
>f<-(x-y)^2
> } # very simple, but ...
> 
> y<-1:10
> x<-c(1)
> cat("test benbad() with y=1:10, x=c(1)\n")
> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
> print(tryfc01)
> print(fc01)

There's quite a lot of output, but it doesn't make much sense to me, as
it refers to code that I didn't write.

The function fchk is attempting to check if functions provided for
optimization do not violate some conditions e.g., character rather than
numeric etc.

JN


On 2019-06-07 8:44 a.m., J C Nash wrote:
> Uwe Ligges <ligges at statistik.tu-dortmund.de>
> Fri Jun 7 11:44:37 CEST 2019
> 
> 
> Right, what problem are you talking about? Can you tell us which check
> it is and what it actually complained about.
> There is no check that looks at the sizes of x and y in expressions
> such as (x - y)^2, as far as I know.
> 
> Best,
> Uwe
> 
> On 07.06.2019 10:33, Berry Boessenkool wrote:
>>
>> Not entirely sure if this is what you're looking for:
>> https://github.com/wch/r-source/blob/trunk/src/library/tools/R/check.R
>> It does contain --as-cran a few times and there's the change-history:
>> https://github.com/wch/r-source/commits/trunk/src/library/tools/R/check.R
>>
>> Regards,
>> Berry
>>
>>
>> 
>> From: R-package-devel  on 
>> behalf of J C Nash 
>> Sent: Thursday, June 6, 2019 15:03
>> To: List r-package-devel
>> Subject: [R-pkg-devel] try() in R CMD check --as-cran
>>
>> After making a small fix to my optimx package, I ran my usual R CMD check 
>> --as-cran.
>>
>> To my surprise, I got two ERRORs unrelated to the change. The errors popped 
>> up in
>> a routine designed to check the call to the user objective function. In 
>> particular,
>> one check is that the size of vectors is the same in expressions like (x - 
>> y)^2.
>> This works fine with R CMD check, but the --as-cran seems to have changed 
>> and it
>> pops an error, even when the call is inside try(). The irony that the 
>> routine in
>> question is intended to avoid problems like this is not lost on me.
>>
>> I'm working on a small reproducible example, but it's not small enough yet.
>> In the meantime, I'm looking for the source codes of the scripts for "R CMD 
>> check" and
>> "R CMD check --as-cran" so I can work out why there is this difference, 
>> which seems
>> to be recent.
>>
>> Can someone send/post a link? I plan to figure this out and provide feedback,
>> as I suspect it is going to affect others. However, it may be a few days or 
>> even
>> weeks if past experience is a guide.
>>
>> JN
>>
>> __
>> R-package-devel using r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>
>>  [[alternative HTML version deleted]]
>>
>> __
>> R-package-devel using r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>
>



Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Should try() not stop those checks from forcing an error?

I recognize that this is the failure -- it is indeed the check I'm trying to
catch -- but I don't want tests of such checks to fail my package.

JN

On 2019-06-07 9:31 a.m., Sebastian Meyer wrote:
> The failure stated in the R CMD check failure report is:
> 
>>  --- failure: length > 1 in coercion to logical ---
> 
> This comes from --as-cran performing useful extra checks via setting the
> environment variable _R_CHECK_LENGTH_1_LOGIC2_, which means:
> 
>> check if either argument of the binary operators && and || has length 
>> greater than one. 
> 
> (see https://cran.r-project.org/doc/manuals/r-release/R-ints.html#Tools)
> 
> The failure report also states the source of the failure:
> 
>>  --- call from context --- 
>> fchk(x, benbad, trace = 3, y)
>>  --- call from argument --- 
>> is.infinite(fval) || is.na(fval)
> 
> The problem is that both is.infinite(fval) and is.na(fval) return
> vectors of length 10 in your test case:
> 
>>  --- value of length: 10 type: logical ---
>>  [1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
> 
> The || operator works on length 1 Booleans. Since fval can be of length
> greater than 1 at that point, the proper condition seems to be:
> 
> any(is.infinite(fval)) || any(is.na(fval))
> 
> Best regards,
> 
>   Sebastian
> 
> 
> Am 07.06.19 um 14:53 schrieb J C Nash:
>> Sorry reply not quicker. For some reason I'm not getting anything in the 
>> thread I started!
>> I found the responses in the archives. Perhaps cc: nas...@uottawa.ca please.
>>
>> I have prepared a tiny (2.8K) package at
>> http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz
>>
>> R CMD check --> OK
>>
>> R CMD check --as-cran --> 1 ERROR, 1 NOTE
>>
>> The error is in an example:
>>
>>> benbad<-function(x, y){
>>># y may be provided with different structures
>>>f<-(x-y)^2
>>> } # very simple, but ...
>>>
>>> y<-1:10
>>> x<-c(1)
>>> cat("test benbad() with y=1:10, x=c(1)\n")
>>> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
>>> print(tryfc01)
>>> print(fc01)
>>
>> There's quite a lot of output, but it doesn't make much sense to me, as
>> it refers to code that I didn't write.
>>
>> The function fchk is attempting to check if functions provided for
>> optimization do not violate some conditions e.g., character rather than
>> numeric etc.
>>
>> JN
>>
>>
>> On 2019-06-07 8:44 a.m., J C Nash wrote:
>>> Uwe Ligges <ligges at statistik.tu-dortmund.de>
>>> Fri Jun 7 11:44:37 CEST 2019
>>>
>>>
>>> Right, what problem are you talking about? Can you tell us which check
>>> it is and what it actually complained about.
>>> There is no check that looks at the sizes of x and y in expressions
>>> such as (x - y)^2, as far as I know.
>>>
>>> Best,
>>> Uwe
>>>
>>> On 07.06.2019 10:33, Berry Boessenkool wrote:
>>>>
>>>> Not entirely sure if this is what you're looking for:
>>>> https://github.com/wch/r-source/blob/trunk/src/library/tools/R/check.R
>>>> It does contain --as-cran a few times and there's the change-history:
>>>> https://github.com/wch/r-source/commits/trunk/src/library/tools/R/check.R
>>>>
>>>> Regards,
>>>> Berry
>>>>
>>>>
>>>> 
>>>> From: R-package-devel  on 
>>>> behalf of J C Nash 
>>>> Sent: Thursday, June 6, 2019 15:03
>>>> To: List r-package-devel
>>>> Subject: [R-pkg-devel] try() in R CMD check --as-cran
>>>>
>>>> After making a small fix to my optimx package, I ran my usual R CMD check 
>>>> --as-cran.
>>>>
>>>> To my surprise, I got two ERRORs unrelated to the change. The errors 
>>>> popped up in
>>>> a routine designed to check the call to the user objective function. In 
>>>> particular,
>>>> one check is that the size of vectors is the same in expressions like (x - 
>>>> y)^2

Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Serguei picked up the glitch and Jeff sorted out the || vs | once any()
was used.

The test that caused the issue was not the one I was looking for,
but another case. However, I'd overlooked the possibility that
there could be different lengths, so || complained (as it should,
but didn't in regular R CMD check).
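The point can be shown in a few lines (a sketch of the issue, not code taken from fchk itself):

```r
fval <- c(0, Inf, NA, 1)   # a length-4 result, as in the fchk() example

# With _R_CHECK_LENGTH_1_LOGIC2_ set, this errors because both
# operands of || have length 4:
#   is.infinite(fval) || is.na(fval)

# any() collapses each operand to length 1, so || is then legal:
any(is.infinite(fval)) || any(is.na(fval))   # TRUE for this fval

# The vectorized | would also work once any() is applied, but ||
# short-circuits and makes the scalar intent explicit.
```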

As an aside, the --as-cran check complains as follows:

The Title field should be in title case. Current version is:
‘A test of R CMD check --as-cran’
In title case that is:
‘A Test of R CMD Check --as-Cran’
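That suggestion comes from tools::toTitleCase(), which can be tried directly to see what the check expects:

```r
# The --as-cran title check runs the Title field through tools::toTitleCase()
tools::toTitleCase("A test of R CMD check --as-cran")
# [1] "A Test of R CMD Check --as-Cran"
```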

Thanks to all.

JN


On 2019-06-07 10:05 a.m., Jeff Newmiller wrote:
>>> any(is.infinite(fval)) || any(is.na(fval))
>>
>> a little typo here: it should be '|', not '||', right ?
> 
> Since `any` collapses the vectors to length 1 either will work, but I would 
> prefer `||`.
> 
> On June 7, 2019 6:51:29 AM PDT, Serguei Sokol  wrote:
>> On 07/06/2019 15:31, Sebastian Meyer wrote:
>>> The failure stated in the R CMD check failure report is:
>>>
>>>>   --- failure: length > 1 in coercion to logical ---
>>>
>>> This comes from --as-cran performing useful extra checks via setting
>> the
>>> environment variable _R_CHECK_LENGTH_1_LOGIC2_, which means:
>>>
>>>> check if either argument of the binary operators && and || has
>> length greater than one.
>>>
>>> (see
>> https://cran.r-project.org/doc/manuals/r-release/R-ints.html#Tools)
>>>
>>> The failure report also states the source of the failure:
>>>
>>>>   --- call from context ---
>>>> fchk(x, benbad, trace = 3, y)
>>>>   --- call from argument ---
>>>> is.infinite(fval) || is.na(fval)
>>>
>>> The problem is that both is.infinite(fval) and is.na(fval) return
>>> vectors of length 10 in your test case:
>>>
>>>>   --- value of length: 10 type: logical ---
>>>>   [1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
>>>
>>> The || operator works on length 1 Booleans. Since fval can be of
>> length
>>> greater than 1 at that point, the proper condition seems to be:
>>>
>>> any(is.infinite(fval)) || any(is.na(fval))
>> a little typo here: it should be '|', not '||', right ?
>>
>> Best,
>> Serguei.
>>
>>> Am 07.06.19 um 14:53 schrieb J C Nash:
>>>> Sorry reply not quicker. For some reason I'm not getting anything in
>> the thread I started!
>>>> I found the responses in the archives. Perhaps cc: nas...@uottawa.ca
>> please.
>>>>
>>>> I have prepared a tiny (2.8K) package at
>>>> http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz
>>>>
>>>> R CMD check --> OK
>>>>
>>>> R CMD check --as-cran --> 1 ERROR, 1 NOTE
>>>>
>>>> The error is in an example:
>>>>
>>>>> benbad<-function(x, y){
>>>>> # y may be provided with different structures
>>>>> f<-(x-y)^2
>>>>> } # very simple, but ...
>>>>>
>>>>> y<-1:10
>>>>> x<-c(1)
>>>>> cat("test benbad() with y=1:10, x=c(1)\n")
>>>>> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
>>>>> print(tryfc01)
>>>>> print(fc01)
>>>>
>>>> There's quite a lot of output, but it doesn't make much sense to me,
>> as
>>>> it refers to code that I didn't write.
>>>>
>>>> The function fchk is attempting to check if functions provided for
>>>> optimization do not violate some conditions e.g., character rather
>> than
>>>> numeric etc.
>>>>
>>>> JN
>>>>
>>>>
>>>> On 2019-06-07 8:44 a.m., J C Nash wrote:
>>>>> Uwe Ligges ||gge@ @end|ng |rom @t@t|@t|k@tu-dortmund@de
>>>>> Fri Jun 7 11:44:37 CEST 2019
>>>>>
>>>>>  Previous message (by thread): [R-pkg-devel] try() in R CMD
>> check --as-cran
>>>>>  Next message (by thread): [R-pkg-devel] using package data in
>> package code
>>>>>  Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
>>>>>
>>>>> Right, what problem are you talking about? Can you tell us which
>> check
>>>>> it is and what it actually complained about.
>>>>> There is no check that looks at the sizes of x and y in
>> expressions
>>>>> such as (x - y)^2.

Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
I've put my tiny package back up, but it likely is doing same
thing as Bill Dunlap's. I took it off as a cleanup when I thought
the issue resolved.

http://web.ncf.ca/fh448/jfiles/fchk_2019-6.5.tar.gz

Seems I've disturbed the ant's nest.

JN



On 2019-06-07 1:53 p.m., William Dunlap wrote:
> I've attached a package, ppp_0.1.tar.gz, which probably will not get through 
> to R-help, that illustrates this.
> It contains one function which, by default, triggers a condition-length>1 
> issue:
>    f <- function(x = 1:3)
>    {
>        if (x > 1) {
>            x <- -x
>        }
>        stop("this function always gives an error")
>    }
> and the help file example is
>    try(f())
> 
> Then 
>    env _R_CHECK_LENGTH_1_CONDITION_=abort,verbose R-3.6.0 CMD check --as-cran 
> ppp_0.1.tar.gz
> results in
> * checking examples ... ERROR
> Running examples in ‘ppp-Ex.R’ failed
> The error most likely occurred in:
> 
>> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
>> ### Name: f
>> ### Title: Cause an error
>> ### Aliases: f
>> ### Keywords: error
>>
>> ### ** Examples
>>
>> try(f())
>  --- FAILURE REPORT --
>  --- failure: the condition has length > 1 ---
>  --- srcref ---
> :
>  --- package (from environment) ---
> ppp
>  --- call from context ---
> f()
>  --- call from argument ---
> if (x > 1) {
>     x <- -x
> }
>  --- R stacktrace ---
> where 1: f()
> where 2: doTryCatch(return(expr), name, parentenv, handler)
> where 3: tryCatchOne(expr, names, parentenv, handlers[[1L]])
> where 4: tryCatchList(expr, classes, parentenv, handlers)
> where 5: tryCatch(expr, error = function(e) {
>     call <- conditionCall(e)
>     if (!is.null(call)) {
>         if (identical(call[[1L]], quote(doTryCatch)))
>             call <- sys.call(-4L)
>         dcall <- deparse(call)[1L]
>         prefix <- paste("Error in", dcall, ": ")
>         LONG <- 75L
>         sm <- strsplit(conditionMessage(e), "\n")[[1L]]
>         w <- 14L + nchar(dcall, type = "w") + nchar(sm[1L], type = "w")
>         if (is.na(w))
>             w <- 14L + nchar(dcall, type = "b") + nchar(sm[1L],
>                 type = "b")
>         if (w > LONG)
>             prefix <- paste0(prefix, "\n  ")
>     }
>     else prefix <- "Error : "
>     msg <- paste0(prefix, conditionMessage(e), "\n")
>     .Internal(seterrmessage(msg[1L]))
>     if (!silent && isTRUE(getOption("show.error.messages"))) {
>         cat(msg, file = outFile)
>         .Internal(printDeferredWarnings())
>     }
>     invisible(structure(msg, class = "try-error", condition = e))
> })
> where 6: try(f())
> 
>  --- value of length: 3 type: logical ---
> [1] FALSE  TRUE  TRUE
>  --- function from context ---
> function (x = 1:3)
> {
>     if (x > 1) {
>         x <- -x
>     }
>     stop("this function always gives an error")
> }
> 
> 
>  --- function search by body ---
> Function f in namespace ppp has this body.
>  --- END OF FAILURE REPORT --
> Fatal error: the condition has length > 1
> * checking PDF version of manual ... OK
> * DONE
> 
> Status: 1 ERROR, 1 NOTE
> See
>   ‘/tmp/bill/ppp.Rcheck/00check.log’
> for details.
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> 
> On Fri, Jun 7, 2019 at 10:21 AM Duncan Murdoch wrote:
> 
> On 07/06/2019 12:32 p.m., William Dunlap wrote:
> > The length-condition-not-equal-to-one checks will cause R to shutdown
> > even if the code in a tryCatch().
> 
> That's strange.  I'm unable to reproduce it with my tries, and John's
> package is no longer online.  Do you have an example I could look at?
> 
> Duncan Murdoch
> 
> >
> > Bill Dunlap
> > TIBCO Software
> > wdunlap tibco.com
> >
> >
> > On Fri, Jun 7, 2019 at 7:47 AM Duncan Murdoch wrote:
> >
> >     On 07/06/2019 9:46 a.m., J C Nash wrote:
> >      > Should try() not stop those checks from forcing an error?
> >
> >     try(stop("msg")) will print the error message
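For reference, a version of Bill's f() that passes the length-one condition check (a sketch; the function still errors deliberately, as his original does):

```r
f <- function(x = 1:3) {
    # any() reduces the length-3 comparison to a single TRUE/FALSE,
    # so _R_CHECK_LENGTH_1_CONDITION_ has nothing to complain about
    if (any(x > 1)) {
        x <- -x
    }
    stop("this function always gives an error")
}
try(f())   # prints the error message; no length > 1 failure
```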

[R-pkg-devel] General considerations about vignettes

2019-08-30 Thread J C Nash
I'm seeking some general advice about including vignettes in my packages,
which are largely for nonlinear estimation and function minimization 
(optimization).
This means that my packages offer alternatives to many other tools, and the user
then has the chore of deciding which is appropriate. Bad choices can be very
costly, in inappropriate results or computational inefficiencies. Hence, I include
vignettes to offer comparisons and examples of use.

Unfortunately, as in a case this week, changes in the comparison packages break
my package(s), and I get an email from CRAN telling me to fix it before some
date not far in the future. This means a) work for me, possibly at an 
inopportune
time; b) risk of loss of capability, in the present case in the nlsr package 
which
offers some unique capabilities, and c) extra work for CRAN for what is, 
arguably,
updating of peripheral documentation. Updating optimization packages on CRAN 
can be,
I have discovered, a very time-consuming task. Package optimx took over 3 months
to get updated.

It should be noted in the present situation that just before I got the msg from
CRAN I got a msg from the maintainer of the package that has changed and breaks
the vignette with some suggestions on a fix. The issue is that his package has
changed function syntax -- a situation all of us know is fraught with troubles,
since improvements may cause breakage.

I am NOT saying that my vignettes should not be updated. However, I'm wondering
if I should set up a repository for my vignettes on Github/Gitlab or similar, 
and
simply link to them. This would separate the updating of vignettes from the 
central
packages. Their updating could be less strictly tied to CRAN activities, and 
could
also be a task undertaken by others who are not listed as maintainer.

I'd welcome some (hopefully constructive) comments. Would CRAN maintainers feel
this to be helpful, or does it lower the value of official R packages? Do
other maintainers experience the same requests, or do they just not include
vignettes (and many do not)?


John Nash
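One partial mitigation, offered only as a sketch ("otherpkg" is a placeholder name, not a package from this thread), is to guard vignette chunks so a comparison is skipped, rather than fatal, when the other package is unavailable or misbehaving:

````
```{r setup, include = FALSE}
# Run comparison chunks only when the comparison package can be loaded
has_otherpkg <- requireNamespace("otherpkg", quietly = TRUE)
# Show individual chunk errors in the vignette instead of failing the build
knitr::opts_chunk$set(error = TRUE)
```

```{r comparison, eval = has_otherpkg}
otherpkg::some_solver(...)   # placeholder call
```
````

This keeps the vignette building on CRAN even when the comparison package changes, at the cost of silently dropping the comparison output.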

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Anyone Know How To Setup Wine for Windows Testing?

2020-07-15 Thread J C Nash
Are you sure you want to try to run R etc. under Wine?

- If you have Windows running, either directly or in a VM, you can run R there.
- If you have Windows and want to run R under some other OS, then set up a VM
e.g., Linux Mint, for that. I sometimes test R for Windows in a VirtualBox VM
for Win10, but generally run in Linux Mint. I've also run R in some Linux VMs
to test for specific dependencies in some distros.

I rather doubt R will run very well in Linux under Wine. My experience with Wine
is that a few apps (e.g. Irfanview) run well, but many give lots of trouble.

JN


On 2020-07-15 1:17 p.m., Steve Bronder wrote:
> Does anyone know of a setup guide for getting R and Rtools 4.0 up and
> running on Wine with the Windows Server 2008 R2 VM? Do other maintainers
> with more knowhow think that would be useful for debugging purposes?
> 
> I've been trying to test out some flto gcc things for windows by setting up
> a local wine VM on my ubuntu box. Wine has an option for Windows Server
> 2008 R2 (which I believe is the windows session CRAN uses?) If anyone has
> done this before and knows of a guide somewhere that would be very helpful!
> 
> Regards,
> 
> Steve Bronder
> 
>   [[alternative HTML version deleted]]
> 



Re: [R-pkg-devel] install.packages() seems not to select the latest suitable version

2020-07-28 Thread J C Nash
Possibly the "old" site-library is not getting over-written. I had to
manually delete.

See https://www.mail-archive.com/r-help@r-project.org/msg259132.html

JN

On 2020-07-28 7:21 a.m., Dirk Eddelbuettel wrote:
> 
> Hi Adelchi,
> 
> On 28 July 2020 at 11:46, Adelchi Azzalini wrote:
> | When I updated package mnormt to version 2.0.0 in June (now at 2.0.1), 
> | at the stage of --as-cran checking, there was a compilation error,  
> | which was overcome by setting the 
> | 
> | Depends:R (≥ 4.0.0)
> | 
> | With this option, all worked fine.
> | 
> | However, shortly afterwards, complaints started coming, 
> | either from users or from maintainers of packages making use of mnormt,
> | because this high version dependence causes troubles to some people,
> | such as those using Debian installations, currently at a much lower 
> | R version.
> 
> You can point those users to a) the r-sig-debian list and b) the Debian
> directory at CRAN as we have always had "backports" of the current R to older
> Debian releases---thanks to the work by Johannes Ranke "backporting" whatever
> my current Debian packages of R are.
> 
> Moreover, you can also point them at `apt install r-cran-mnormt` -- I have
> maintained your package within Debian since 2007 (!!) and continue to do so
> giving Debian (and Ubuntu) users the choice between a distro binary and
> installation from CRAN source. 
>  
> | At the time I select that dependence value, I relied on the fact that
> | install.packages() selected the most recent suitable version of a package,
> | given the existing R installation. I expected that people without
> | R 4.0.0 would have the older version of mnormt, 1.5-7, installed.
> | As my memory goes (and the memory of other people too), this was 
> | the working in the past, but apparently not any more. 
> 
> I don't think that is quite correct. The CRAN repo is always set up for the
> currently released version, and may allow constraints such 'R (>= 4.0.0)'
> imposing the current (major) release.
> 
> There is no recent change in this behavior.
> 
> | For instance, this is a passage from a specific user:
> |  
> | "install.packages() used tp just install the most recent available 
> | for your current version of R.  In the past it might have done just that, 
> | but that's clearly not the case currently."
> 
> Yes and no. I don't think this correctly stated. `install.packages()` always
> picks the most recent version, but this may also require running _the
> current_ R release.  I disagree about "not the case currently" -- no change
> as stated above.
> 
> | Can anyone clarify the reason of this (apparent? real?) change?
> | ...and possibly indicate a way who people with lower R version (and perhaps
> | limited R expertise) can install the older version of mnormt, 1.5-7, 
> | without much hassle?
> 
> "Versioned" installs were never supported by `install.packages()`.
> 
> But one could always download an older version to a local file, and point
> install.packages() at that file (and setting 'repos=NULL'), or use `R CMD
> INSTALL` directly. No change there either.
> 
> Dirk
>
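Dirk's last point can be illustrated in two lines; the CRAN Archive URL layout below is CRAN's standard one, with mnormt 1.5-7 shown only as an example:

```r
# Install an archived release (here mnormt 1.5-7) without a "versioned" install
url <- "https://cran.r-project.org/src/contrib/Archive/mnormt/mnormt_1.5-7.tar.gz"
install.packages(url, repos = NULL, type = "source")
```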



[R-pkg-devel] Closing a display window from a program

2024-11-13 Thread J C Nash

Hi,

I'm trying to prepare a package to manage some diverse data. It is
extremely convenient to display a PDF derived from the data so the user
can verify information that the R script is going to process (even nicer
if on a 2 monitor system). I've 2 issues:

1) I can display the pdf either with file_show() from the fs package
or browseURL(). Both leave the pdf displayed and return to the
R "terminal". But when I finish processing at the end of the script,
I want to have the display close. This seems to need (non-platform-neutral)
use of a system() call to wmctrl, which is only on X-windows systems.
Note that the display tools don't seem to return a handle that I can use to
close the display window some other way. On my system both the tools
for display loaded xreader (I'm on Linux Mint Wilma).

2) the tools also don't seem to offer window size or placement, which
  would be nice.

Suggestions welcome.

John Nash



[R-pkg-devel] Closing a display window from a program

2024-11-19 Thread J C Nash

In answer to my own posting about this, I found the following partially
answers what I need.

The task is to display a file (in my case a pdf), for which 'xdisp' contains
the string for the command to do this. See below for the reason this is a
variable. The displayed file has information I will need to possibly enter
into other program structures and which may be in the form of pixels rather
than text or numbers. Later, I wish to close the display when the need is past.
Clearly, that step could be left to the user, but I believe in decluttering
the screen as much as I can.


  cmd <- paste0(xdisp, " ", xdocfn, " &") # add & so the viewer keeps running and control returns
  system(cmd)
  pidcmd<-paste0("pgrep -f '",cmd,"'")
  spid<-system(pidcmd, intern=TRUE) # gets spid, but stays running
  # For some reason increased by 1
  spid<-as.integer(spid)-1

Later on I can close the display with

# Kill the display of the statement now.
  tmp<-readline("Kill the bill display process now")
  kcmd<-paste0("kill -9 ",spid)
  system(kcmd)

Clearly I'm on a Linux system (Mint 22 Wilma). I've not considered how I'd do
this
in Windows, but would welcome suggestions as I believe cross-platform solutions
widen the utility of any software. I also looked at the CRAN package ps, and 
found
there are ways to use some of its tools. They may, in fact, be more 
system-neutral,
so are still on my radar for consideration.

My posting asked also if the display could be controlled as to position and 
size.
I have not yet found how to do that, but have found choice of the pdf display
program a partial help. I am currently setting xdisp to xpdf in an "ini" file,
for which the CRAN package ini is most helpful.

For information, I put below a solution to an Rstudio matter that arose.

Comments and discussion welcome.

Best,

John Nash

PS. In building this code in Rstudio and using browseURL(file) to open the 
display,
I got a warning error about "signal 10". This can be overcome by using
browseURL(file, browser="xdg-open"). However, "file" must be either a simple
filename of a file in the current directory, or a fully qualified path. Use
of the shortcut "~/" for the home directory fails. This issue does not arise
for R in a terminal.



[R-pkg-devel] Query about broken reverse dependencies that already are broken

2024-12-10 Thread J C Nash

This is a question about how things are done rather than a request for a fix.

Yesterday I re-submitted my optimx package with some small but important fixes
(e.g., one place where one solver would miss catching function evaluation 
limits).
I'd done a revdepcheck that came up with "Wow, no problems", but in fact one of
the revdeps had a test failure which was already flagged in its checks list on
CRAN. A package test example had a singularity on some systems. This can happen 
with
nonlinear function minimization due to very small changes in arithmetic and
approximations of different systems. Putting in checks and "graceful failure"
for such conditions is the ideal for optimization solvers, but it isn't easy.
Some of the changes in the optimx update submitted are of this flavour.

When the submission checks came back this morning there was a
"Changes to worse in reverse depends:", even though there really is no change.
However, I then got a msg "Thanks, on its way to CRAN."

Am I correct in assuming a manual review passed the package (which hopefully I
did get fully compliant)? Or will I get an eventual "please fix" for something
clearly outside my scope of action?

As indicated, at the moment this isn't a request for help, though that may come
later.

Cheers,

John Nash



[R-pkg-devel] Procedure for dealing with reverse dependency that has gone bad on CRAN version of my package

2025-04-09 Thread J C Nash

In correcting a minor bug in the optimx package, I submitted a new version 
today.
The existing CRAN version (2024-12.2) has all CRAN checks OK, and to my 
knowledge
no notifications of bugs. The new version has hit a FAIL with package phenofit 
revdep.
Wanting to have output from a "working" system, I (re-)installed the CRAN 
version
of optimx and ran R CMD check on phenofit, and also get the FAIL on one of its 
tests.

The failure is a solve error (singularity) in the called minimizer nlminb, so 
could be
a result of some alteration in something like the matrix routines. I've also 
seen
changes in tests due to use of M1 and similar processors, relaxed IEEE 
arithmetic
being the cause. Such differences in outcome are not really a surprise, but can
be a nuisance.

My query is how to proceed to resolve this situation. Since my existing package 
--
which was supposedly OK back in December -- also throws the test failure in 
phenofit,
I've no path to reverse changes in my code. It's also possible there's no error
in phenofit, just a test case that now trips up when it did not prior to some
external change.

Still, some resolution is needed, and it would be good to avoid too much fuss.

John Nash



Re: [R-pkg-devel] Procedure for dealing with reverse dependency that has gone bad on CRAN version of my package

2025-04-10 Thread J C Nash



I've had an email from CRAN team that the revdep failure with phenofit is a 
known issue and
updated optimx package (fixing glitch in computing approximate Hessian when 
objective uses dot args)
is on its way to CRAN.

John Nash



Re: [R-pkg-devel] Use of long double in configuration

2025-04-30 Thread J C Nash

As one of the original 30-some members of the 1985 IEEE 754 committee, I find it discouraging that
we are treating long double as the exception, when it is the introduction of
"short double" in M1 etc. chips that has forced the issue. There are strong
commercial reasons, but they aren't computational ones.

JN


On 2025-04-30 05:08, Tim Taylor wrote:

Thank you all!

Everything is clear.

Tim

On Wed, 30 Apr 2025, at 10:07 AM, Tomas Kalibera wrote:

On 4/30/25 10:43, Tim Taylor wrote:

Cheers for the quick response.

To clarify my question: Is it correct to say that as long as packages do not 
assume the greater precision provided by 'long double' there is no reason they 
cannot use 'long double' to get *possible* advantages (e.g. in summations). 
AFAICT 'long double' is (and has always been) part of the C standard so its 
use as a type should be unproblematic (this is the query relevant to 
matrixStats).


Probably already clear from previous answers, but yes, packages can use
long double type.

Whenever using a long double type, one needs to be careful about making
sure the algorithms work, and the tests pass (so have reasonable
tolerances), even when the long double type happens to be just the same
as double. This is the case on aarch64, and macOS/aarch64 is one of the
platforms where packages have to work, anyway, so this shouldn't be too
limiting anymore - but really one needs to test on such platform.

R itself has an option to disable use of long double to make such
testing in R itself possible also on other platforms. In principle one
could do something similar in a package, have some ifdefs to disable
long doubles, but this is not required. And I probably wouldn't do that,
I'd just test on aarch64 regularly.

See Writing R Extensions for more details.
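A quick way to see from R itself whether a given build has a distinct long double type:

```r
# TRUE when long double is wider than double; FALSE e.g. on macOS/aarch64
capabilities("long.double")

# Where TRUE, .Machine also reports the extended type's properties,
# e.g. 64 mantissa bits for the x86 80-bit extended format
.Machine$longdouble.digits
```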

Best
Tomas


Apologies if this does not make much sense.

Tim



On Wed, 30 Apr 2025, at 9:33 AM, Uwe Ligges wrote:

On 30.04.2025 10:25, Tim Taylor wrote:

Is it correct to say that R's conditional use of long double is around ensuring 
things work on platforms which have 'long double' identical to 'double' types, 
as opposed to there being an odd compiler targeted that does not even have any 
concept of 'long double' type?

A double is 64 bits and stored that way on all platforms; the concept of
long doubles is CPU specific. x86 chips have 80 bits in the floating point
units for calculations before rounding (and normalizing) to a regular
double.

Some chips, e.g. the ARM chips used in current M[123] Macs (hence a very
relevant topic), do not support long doubles. Compilers also offer to
compile without support for long doubles, which CRAN uses e.g. for an
additional (issues) check.

Best,
Uwe Ligges


As background this was motivated by a query raised in the matrixStats package:
https://github.com/HenrikBengtsson/matrixStats/issues/278










Re: [R-pkg-devel] Use of long double in configuration

2025-05-04 Thread J C Nash

By "short double" I simply meant the 64 bit double without the extended 
precision.
Wikipedia has a "long double" article, which is quite good in showing the 
diversity
of interpretations.

I've not spent much time looking at the standards since the 80s, except to note
the move to make the extended bits optional recently. As far as I am aware, any
precision above 64 bit storage but 80 bits in registers required special 
software
and hardware, but the 64/80 combination seemed to be more or less a given until
the M? chips came along.

For a while I recall there were all sorts of things with gate arrays to make
special hardware. They needed special compilers and were a nuisance to use.
Fortunately, not a great many computations require more than 64 bit doubles,
but there are some situations that need the extra precision for some of the 
time.
I remember Kahan saying it was like passing lanes added to two lane highways
to clear out the lines behind a slow truck.

The widespread adoption of IEEE 754 with the 80 bit extended was a great step
forward. For a while most hardware implemented it, but the M? chips don't
have the 80-bit extended registers, hence the push for changing the standard. It's a pity mechanical
calculators are no longer common, like the Monroe manual I gave away to a
charity last year. They had double width accumulator carriages for a reason,
and make the situation very obvious.

The burden for R developers is that a lot of the behind-the-scenes computations
are done with pretty old codes that very likely were developed with extended
"thinking". Particularly if there are accumulations e.g., of inner products,
we can expect that there will be some surprises. My guess is that there
will need to be some adjustment of the internal routines, most likely to define
somewhat lower standards. My experience has been that results that are a bit
different cause lots of confusion, even if the reality isn't of great import,
and such confusion will be the main waste of time.

I built the "Compact numerical methods" codes on Data General machines, which had
32-bit floating point with, I recall, a 24-bit mantissa. Later (worse!) the
DG Eclipse used a 6 hex digit mantissa. The special functions were pretty
cruddy too. So my codes were incredibly defensive, making them quite reliable
but pretty slow. Also I had to fit program and data in 4K bytes, so the
codes left out bells and whistles. Then we got IEEE 754, which really
helped us out of the bog of weird and not wonderful floating point.
Look up "Fuzz" on Tektronix BASIC. Messed up a lot of my codes.

Cheers,

JN





On 2025-05-04 19:25, Simon Urbanek wrote:

John,

it's sort of the other way around: because neither the implementation, format nor precision of 
"long double" are defined by the C standard (it's not even required to be based on IEEE 
754/IEC 60559 at all), it is essentially left to the compilers+runtimes to do whatever they choose, 
making it a bit of a wild card. Historically, anything beyond double precision was emulated since 
most hardware was unable to natively deal with it, so that’s why you had to think hard if you 
wanted to use it as the penalty could be an order of magnitude or more. It wasn't until Intel’s 
math co-processor and its 80-bit extended precision format which reduced the penalty for such 
operations on that CPU and was mapped to long double - at the cost of results being 
hardware-specific, somewhat arbitrary and only 2.5x precision (while occupying 4x the space). So 
long double is just a simple way to say "do your best" without defining any specifics.

I’m not sure what you mean by "short double" as double precision is defined as 64-bit, so 
"short double" would be simply 32-bit = single precision. More recent introduction of varying 
floating point precisions such as fp8, f16 etc. (assuming that’s what you meant by "short double" 
on M1) are performance and memory usage optimizations for use-cases where the precision is less important 
than memory usage, such as in large NNs. M1 is just one of the modern chips that added co-processors 
specifically for matrix operations on different precisions like fp16, fp32, fp64 - with great performance 
gains (e.g., using AMX via Apple's BLAS/LAPACK with double precision in R on M1 is over 100x faster than the 
CPU-based reference version for some operations).

As for precision beyond doubles, Apple has asked few years ago the scientific 
community whether there is interest in fp128 (quad precision) and the response 
was no, it’s not a priority, so I would assume that’s why it has been left to 
emulations (it is interesting in retrospect, because at that point we had no 
idea that they were designing what became Apple Silicon). I presume it would be 
possible to leverage the Apple matrix co-processor for fp128 operations (e.g., 
PowerPC used double-double arithmetics impleme