Re: [Rd] [tryExcept] New try Function
I have two related packages that are already submitted to CRAN but are
awaiting approval. They are part of the R Documentation Task Force's efforts
to improve documentation. The exact function you are referring to I have
called `catch_condition()` and included in my `testextra` package. You might
also want to check out the `pkgcond` package, which facilitates creating
informative conditions (errors, warnings, and messages). These conditions are
automatically classed to tell you where the error is coming from: the package
and function, including the class for reference methods.

https://github.com/RDocTaskForce/testextra
https://github.com/RDocTaskForce/pkgcond

On Fri, Nov 23, 2018 at 7:48 AM Ernest Benedito wrote:
> Hi Emil,
>
> First, thanks for the response. As you mentioned, a lot of the time
> tryCatch does the job, as you can return a value. However, sometimes it is
> useful to assign several variables when an error occurs. You could do it
> with <<-, but I prefer to avoid its use unless completely necessary.
>
> I guess that the attachment was dropped in moderation. Here is the
> function:
>
> tryExcept <- function(expr, except = {})
> {
>     doTryExcept <- function(expr, parentenv) {
>         .Internal(.addCondHands("error", list(NULL), parentenv,
>                                 environment(), FALSE))
>         expr
>     }
>     parentenv <- parent.frame()
>     doTryExcept(return(expr), parentenv)
>     invisible(except)
> }
>
> As you can see, the tryExcept function uses a simplified version of the
> tryCatch architecture, but it allows you to pass a second expression that
> is evaluated in case an error occurs during the evaluation of the first
> expression.
> It could even work as an infix operator:
>
> `%except%` <- tryExcept
>
> # dummy example
> {foo <- "foo"} %except% {foo <- "foo bar"}
> print(foo) # "foo"
>
> {
>   foo <- "foo"
>   stop()
> } %except% {
>   foo <- "foo bar"
> }
> print(foo) # "foo bar"
>
> Its main downside is that you are not able to handle the error that
> occurred, although there is the possibility of adding a 'silent' parameter,
> as in 'try', in order to print the error if desired. All in all, this
> would be a function for simple error handling, but I think it would be
> practical, and you can always move to tryCatch if you need more complex
> error handling.
>
> I will be looking forward to hearing your insights.
>
> Best,
> Ernest Benedito
>
> Message from Emil Bode on Fri, 23 Nov 2018 at 13:17:
>
> > Hi Ernest,
> >
> > To start: I don't see an attachment; I think they're not (always)
> > allowed on this mailing list. If you want to send something, text is
> > your safest bet.
> > But regarding the issue of tryCatch: I think you're not fully using
> > what it can already do. In almost all circumstances I've encountered,
> > the following works fine:
> >
> > res <- tryCatch(expr, error = function(cond) {
> >   # a bunch of code
> >   # Some value to be stored in res
> > })
> >
> > The only difference is that now "# a bunch of code" is run from inside
> > a function, which means you're working in a different environment, and
> > if you want to assign values to other variables you need to use <<- or
> > assign.
> > For a modified function, I think it would be nice if there were a way
> > to supply an expression instead of a function, so that evaluation (and
> > assignment!) takes place in the same environment as the main code in
> > the tryCatch (in expr). Is that what you made?
> > And with the current tryCatch, you could use something like this:
> >
> > res <- tryCatch(expr, error = function(e) evalq({
> >   # a bunch of code
> >   # Some value for res
> > }, envir = parent.frame(4)))
> > # The 4 is because some internal functions are involved;
> > # parent.frame(4) is the same environment as used by expr
> >
> > Although this is cumbersome, and it gets even more cumbersome if you
> > want to access the error object in "# a bunch of code", or use it to
> > return to a higher level, so I get that you're looking for a more
> > elegant solution.
> >
> > Best regards,
> > Emil Bode
> >
> > On 23/11/2018, 08:49, "R-devel on behalf of Ernest Benedito" <
> > r-devel-boun...@r-project.org on behalf of ebenedi...@gmail.com> wrote:
> >
> >     Hi everyone,
> >
> >     When dealing with errors, sometimes I want to run a bunch of code
> >     when an error occurs.
> >     For now I usually use a structure such as:
> >
> >     res <- tryCatch(expr, error = function(cond) cond) # or try(expr)
> >
> >     if (inherits(res, "error")) # or inherits(res, "try-error")
> >       # a bunch of code
> >
> >     I thought it would be useful to have a function that does this
> >     naturally, so I came up with the attached function.
> >
> >     I would be glad to hear your insights and whether you think it
> >     would make sense to add this function to R.
> >
> >     Best regards,
> >     Ernest
> > __
> > R-devel@r-project.org mailing list
> > htt
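[Editor's aside: for readers who want the same behaviour without the `.Internal` call, here is a hedged sketch of my own (not code from the thread) of an `%except%` operator built on plain tryCatch. It relies on R's lazy evaluation: `except` is a promise created in the caller's frame, so forcing it inside the error handler evaluates it there, and any assignments land in the caller's environment without `<<-`.]

```r
# Sketch only: a portable %except% using tryCatch instead of .Internal.
# Neither argument is forced until needed; `except` is only evaluated
# (in the caller's environment) if `expr` signals an error.
`%except%` <- function(expr, except) {
  invisible(tryCatch(expr, error = function(cond) except))
}

foo <- "unset"
{ foo <- "foo" } %except% { foo <- "foo bar" }
print(foo)  # "foo": no error, so `except` was never evaluated

{ foo <- "foo"
  stop("boom")
} %except% {
  foo <- "foo bar"
}
print(foo)  # "foo bar": the handler forced `except` in this frame
```

Unlike the posted tryExcept, this version catches the error through an ordinary handler, so the condition object is available as `cond` if one later wants a `try`-style `silent` option.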
[Rd] R-devel from 1997 Good Read
hi all

This is possibly a bit off topic. However, the mailing list is quiet at the
moment, so I thought I would mention it. Recently, I had a look at R-devel
(R-alpha?) from 1997, and it's very informative and very cool. I'm planning
to go through it more carefully when I get some more free time. I encourage
other R supporters to do the same.

kind regards
Abs

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] issue with testInstalledPackage
Background: I run tools::testInstalledPackage on all packages that depend on
survival (605 as of today) before sending a new release to CRAN. It has a
few false positives, which I then follow up on. (Mostly packages with
as-yet-incomplete tests in their inst directory.)

Issue: testInstalledPackage("mets") generates an "Error in
checkVignettes(pkg, lib.loc, latex = FALSE, weave = TRUE)" message, which
stops my script. The source code for that package passes R CMD check. I can
easily work around this in the short term by adding mets to my do-not-check
list.

Footnote: the newer "check_packages_in_dir" routine doesn't work for me. The
biggest reason is that it doesn't have a way to give a list of packages to
skip. My little desktop box doesn't have every Linux library (cryptography,
geospatial, etc.), nor do I load up Bioconductor, which leads to a boatload
of false positives. I keep adding things, but the packages add faster.

Terry T.
[Rd] Suggestion for `glm.fit`
Dear sirs,

One gets unexpected `residuals` if one is not aware of the meaning of
weights when a weight is set to zero and the outcome is one in the
`binomial` family in a call to `glm.fit`. The reason is the following line
from `binomial()$initialize`:

> y[weights == 0] <- 0

Here is an example:

pval <- seq(.05, .95, length.out = 25)
X <- log(pval / (1 - pval)) - 2
Y <- c(
  FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE,
  FALSE, FALSE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, TRUE,
  TRUE, TRUE, TRUE, FALSE, TRUE, TRUE)

W <- rep(1, length(Y))
W[length(W)] <- 0
fit <- glm(Y ~ X, binomial(), weights = W)
fit$residuals[25]
#R        25
#R -45.77847

# Maybe it should be the following. Otherwise, maybe there should be a
# warning in `binomial()$initialize` when `y`s are set to zero?
with(
  fit, tail((Y - fitted.values) / binomial()$mu.eta(linear.predictors), 1))
#R       25
#R 1.022332

sessionInfo()
#R R version 3.5.1 (2018-07-02)
#R Platform: x86_64-w64-mingw32/x64 (64-bit)
#R Running under: Windows >= 8 x64 (build 9200)
#R
#R Matrix products: default
#R
#R locale:
#R [1] LC_COLLATE=English_United States.1252
#R [2] LC_CTYPE=English_United States.1252
#R [3] LC_MONETARY=English_United States.1252
#R [4] LC_NUMERIC=C
#R [5] LC_TIME=English_United States.1252
#R
#R attached base packages:
#R [1] stats graphics grDevices utils datasets methods
#R [7] base
#R
#R loaded via a namespace (and not attached):
#R [1] compiler_3.5.1 tools_3.5.1    yaml_2.1.18

Sincerely yours,
Benjamin Christoffersen
Re: [Rd] Suggestion for `glm.fit`
I don't know whether this helps or not, but using residuals(fit) rather than
fit$residuals returns 0 for the last value. This is different from
(predict(fit, type = "response")[25] - Y[25]) (or the equivalent Pearson
residual) because the *weighted* residuals are returned by definition (not
that I see this being explained super-clearly in the documentation) ...

FWIW, ?residuals says:

    The abbreviated form 'resid' is an alias for 'residuals'. It is
    intended to encourage users to access object components through an
    accessor function rather than by directly referencing an object slot.

- in other words, if you use fit$residuals you're expected to know exactly
what you're doing and how things might get weird ...

On 2018-11-26 5:08 p.m., Benjamin Christoffersen wrote:
> Dear sirs,
>
> One gets unexpected `residuals` if one is not aware of the meaning of
> weights when a weight is set to zero and the outcome is one in the
> `binomial` family in a call to `glm.fit`. The reason is the following
> line from `binomial()$initialize`:
>> y[weights == 0] <- 0
>
> Here is an example:
>
> pval <- seq(.05, .95, length.out = 25)
> X <- log(pval / (1 - pval)) - 2
> Y <- c(
>   FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE,
>   FALSE, FALSE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, TRUE,
>   TRUE, TRUE, TRUE, FALSE, TRUE, TRUE)
>
> W <- rep(1, length(Y))
> W[length(W)] <- 0
> fit <- glm(Y ~ X, binomial(), weights = W)
> fit$residuals[25]
> #R        25
> #R -45.77847
>
> # Maybe it should be the following. Otherwise, maybe there should be a
> # warning in `binomial()$initialize` when `y`s are set to zero?
> with(
>   fit, tail((Y - fitted.values) / binomial()$mu.eta(linear.predictors), 1))
> #R       25
> #R 1.022332
>
> sessionInfo()
> #R R version 3.5.1 (2018-07-02)
> #R Platform: x86_64-w64-mingw32/x64 (64-bit)
> #R Running under: Windows >= 8 x64 (build 9200)
> #R
> #R Matrix products: default
> #R
> #R locale:
> #R [1] LC_COLLATE=English_United States.1252
> #R [2] LC_CTYPE=English_United States.1252
> #R [3] LC_MONETARY=English_United States.1252
> #R [4] LC_NUMERIC=C
> #R [5] LC_TIME=English_United States.1252
> #R
> #R attached base packages:
> #R [1] stats graphics grDevices utils datasets methods
> #R [7] base
> #R
> #R loaded via a namespace (and not attached):
> #R [1] compiler_3.5.1 tools_3.5.1    yaml_2.1.18
>
> Sincerely yours,
> Benjamin Christoffersen
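[Editor's aside: to make the distinction above concrete, here is a small self-contained sketch with assumed toy data (not the data from the thread). `fit$residuals` stores working residuals computed after `binomial()$initialize` has silently zeroed the response for zero-weight cases, while the `residuals()` accessor returns weighted residuals, which are exactly zero for such cases.]

```r
# Toy data: the last observation has outcome 1 but prior weight 0.
set.seed(1)
x <- seq(-2, 2, length.out = 10)
y <- c(0, 0, 0, 1, 0, 1, 1, 1, 1, 1)
w <- c(rep(1, 9), 0)
fit <- glm(y ~ x, binomial(), weights = w)

fit$residuals[10]    # working residual, computed with y[10] replaced by 0
residuals(fit)[10]   # weighted deviance residual: exactly 0
```

The surprise in the thread is the first value: it is large in magnitude and of the wrong sign for an outcome of one, because the initialize step rewrote the response before the working residuals were stored.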
Re: [Rd] issue with testInstalledPackage
FWIW I've been reasonably happy with the revdepcheck package: it's not
base-R, but it's pretty robust (lme4 'only' has 286 dependencies to check
...). I've had much better luck running it on a remote server (where the
sysadmin is responsive, so it's not too much trouble to get extra system
dependencies/Debian packages installed as they become necessary).

On 2018-11-26 3:42 p.m., Therneau, Terry M., Ph.D. via R-devel wrote:
> Background: I run tools::testInstalledPackage on all packages that depend
> on survival (605 as of today) before sending a new release to CRAN. It
> has a few false positives, which I then follow up on. (Mostly packages
> with as-yet-incomplete tests in their inst directory.)
>
> Issue: testInstalledPackage("mets") generates an "Error in
> checkVignettes(pkg, lib.loc, latex = FALSE, weave = TRUE)" message, which
> stops my script. The source code for that package passes R CMD check.
> I can easily work around this in the short term by adding mets to my
> do-not-check list.
>
> Footnote: the newer "check_packages_in_dir" routine doesn't work for me.
> The biggest reason is that it doesn't have a way to give a list of
> packages to skip. My little desktop box doesn't have every Linux library
> (cryptography, geospatial, etc.), nor do I load up Bioconductor, which
> leads to a boatload of false positives. I keep adding things, but the
> packages add faster.
>
> Terry T.
Re: [Rd] Subsetting row in single column matrix drops names in resulting vector
Dmitriy Selivanov (selivanov.dmit...@gmail.com) wrote:
> Consider the following example:
>
> a = matrix(1:2, nrow = 2, dimnames = list(c("row1", "row2"), c("col1")))
> a[1, ]
> # 1
>
> It returns an *unnamed* vector `1` where I would expect a named vector.
> In fact it returns a named vector when the number of columns is > 1.
> The same issue applies to a single-row matrix. Is it a bug? It looks
> very counterintuitive.

This and related issues are addressed in pqR, in the new release of
2018-11-18. (See pqR-project.org, and my blog post at
radfordneal.wordpress.com.)

The behaviour of a[1,] is unchanged, for backwards-compatibility reasons.
But in pqR one can explicitly mark an argument as missing using "_". When an
array subscript is missing in this way, the names will not be dropped in
this context, even if there is only one of them. So a[1,_] will do what you
want:

> a = matrix(1:2, nrow = 2, dimnames = list(c("row1", "row2"), c("col1")))
> a[1, ]
[1] 1
> a[1,_]
col1
   1

Furthermore, pqR will not drop names when the subscript is a 1D array (i.e.,
has a length-1 dim attribute), even if it is only one long. In pqR,
sequences that are 1D arrays are easily created using the .. operator. So
the following works as intended when .. is used, but not when the old :
operator is used:

> a = matrix(1:4, nrow = 2, dimnames = list(c("row1","row2"), c("col1","col2")))
> n = 2
> a[1, 1:n]
col1 col2
   1    3
> a[1, 1..n]
col1 col2
   1    3
> n = 1
> a[1, 1:n]
[1] 1
> a[1, 1..n]
col1
   1

You can read more about this in my blog post at
https://radfordneal.wordpress.com/2016/06/25/fixing-rs-design-flaws-in-a-new-version-of-pqr/

That was written when most of these features were introduced, though getting
your specific example right relies on another change introduced in the most
recent version.

Radford Neal
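[Editor's aside: for those staying on base R rather than pqR, here is a hedged workaround sketch of my own (not from the thread): subset with `drop = FALSE` so the dimnames survive, then rebuild the named vector by hand.]

```r
a <- matrix(1:2, nrow = 2,
            dimnames = list(c("row1", "row2"), c("col1")))

a[1, ]                          # plain 1; names are dropped with the dimension
r1 <- a[1, , drop = FALSE]      # 1 x 1 matrix with dimnames kept
v  <- setNames(drop(r1), colnames(r1))
v                               # named vector: col1 = 1
```

This is more verbose than pqR's `a[1,_]`, but it is robust to the single-row/single-column case and works in any R version.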