Re: [R] R CMD INSTALL fails where R CMD check succeeds.

2012-02-24 Thread Rob Steele
That's it! I was mounting /tmp with the noexec option; that was the problem. Thanks! On 2/24/2012 12:56 PM, Prof Brian Ripley wrote: > Guess: did you set TMPDIR to somewhere you are allowed to execute scripts? __ R-help@r-project.org mailing list
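The workaround discussed above can be sketched as follows (a hedged sketch: the directory path is just an example, and the package tarball name is hypothetical):

```shell
# /tmp mounted with noexec prevents R from running the configure scripts
# that source packages execute during installation.  Point TMPDIR at a
# directory on an exec-mounted filesystem instead.
mkdir -p "$HOME/rtmp"
export TMPDIR="$HOME/rtmp"
# R CMD INSTALL RODBC_1.3-0.tar.gz   # tarball name/version illustrative
```

R CMD INSTALL honors TMPDIR for its temporary build directory, so the setting must be exported in the shell that invokes it.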

[R] R CMD INSTALL fails where R CMD check succeeds.

2012-02-24 Thread Rob Steele
This is in a 64-bit CentOS 5.6 instance at Amazon AWS with R version 2.14.1 (2011-12-22). It happens on several packages: RMySQL, RODBC, FastICA. Many other packages install just fine. Here's an example error message: * installing *source* package 'RODBC' ... ** package 'RODBC' successfully

Re: [R] Memory leak in system() command?

2010-09-07 Thread Rob Steele
You're quite right; the problem lies in an SQL query ("LOAD DATA INFILE"). Thanks! On 9/7/2010 6:50 AM, Duncan Murdoch wrote: > Rob Steele wrote: >> I run an external program a few hundred times via >> >> status <- system(command = "blah blah blah")

[R] Memory leak in system() command?

2010-09-07 Thread Rob Steele
I run an external program a few hundred times via status <- system(command = "blah blah blah") and pretty soon Linux thinks R is using tons of memory. R doesn't think so, at least not according to gc(). I'm also opening, reading and closing files but I don't think that's where the problem lies.
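The loop described above can be sketched as follows (a minimal sketch: `"true"` is a stand-in for the real command, which per the follow-up was a MySQL client issuing LOAD DATA INFILE; a Unix-like system is assumed):

```r
# Repeatedly invoke an external program via system(), as in the post.
run_external <- function(n) {
  for (i in seq_len(n)) {
    status <- system("true")   # stand-in for the real command line
    if (status != 0) stop("external command failed on iteration ", i)
  }
  invisible(gc())  # gc() reports only R's own heap, not the child's memory
}

run_external(3)
```

Note that memory Linux attributes to the session can come from the child process or from page cache filled by its I/O; gc() will not show it, which is consistent with the resolution that the growth came from the SQL load rather than from R.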

Re: [R] Moving quantile()?

2009-11-26 Thread Rob Steele
Charles C. Berry wrote: > On Thu, 26 Nov 2009, Rob Steele wrote: > >> Is there a faster way to get moving quantiles from a time series than to >> run quantile() at each step in the series? > > > Yes. > > Run > > help.request() > > Since

Re: [R] Best way to preallocate numeric NA array?

2009-11-26 Thread Rob Steele
Douglas Bates wrote: > On Thu, Nov 26, 2009 at 10:03 AM, Rob Steele > wrote: >> These are the ways that occur to me. >> >> ## This produces a logical vector, which will get converted to a numeric >> ## vector the first time a number is assigned to it. That seems >

[R] Best way to preallocate numeric NA array?

2009-11-26 Thread Rob Steele
These are the ways that occur to me. ## This produces a logical vector, which will get converted to a numeric ## vector the first time a number is assigned to it. That seems ## wasteful. x <- rep(NA, n) ## This does the conversion ahead of time but it's still creating a ## logical vector first,
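The options under discussion can be compared directly (a small sketch; `n` is arbitrary):

```r
n <- 1e6

x1 <- rep(NA, n)         # logical vector; silently converted to double
                         # the first time a number is assigned into it
x2 <- rep(NA_real_, n)   # numeric (double) NA from the start --
                         # no later conversion pass
x3 <- numeric(n)         # double zeros first ...
x3[] <- NA               # ... then overwritten with NA: two passes

stopifnot(is.logical(x1), is.double(x2), is.double(x3))
```

`rep(NA_real_, n)` gives a numeric vector in one step, avoiding both the logical-to-double coercion of `x1` and the double initialization of `x3`.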

[R] Moving quantile()?

2009-11-26 Thread Rob Steele
Is there a faster way to get moving quantiles from a time series than to run quantile() at each step in the series? Thanks, Rob
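For reference, the step-at-a-time baseline being asked about looks like this (window width and data are illustrative); add-on packages such as caTools (`runquantile`) and zoo (`rollapply`) offer faster alternatives than calling quantile() in a loop:

```r
# Naive moving quantile: call quantile() once per window position.
moving_quantile <- function(x, k, p = 0.5) {
  n <- length(x)
  vapply(seq_len(n - k + 1), function(i) {
    quantile(x[i:(i + k - 1)], probs = p, names = FALSE)
  }, numeric(1))
}

moving_quantile(1:10, k = 3)   # running medians: 2 3 4 5 6 7 8 9
```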

Re: [R] Reading large files quickly; resolved

2009-05-11 Thread Rob Steele
Rob Steele wrote: > I'm finding that readLines() and read.fwf() take nearly two hours to > work through a 3.5 GB file, even when reading in large (100 MB) chunks. > The Unix command wc, by contrast, processes the same file in three > minutes. Is there a faster way to read files

Re: [R] Reading large files quickly

2009-05-10 Thread Rob Steele
meric? Are you keeping it in > a dataframe? Have you considered using 'scan' to read in the data and to > specify what the columns are? If you would like some more help, the answer > to these questions will help. > > On Sat, May 9, 2009 at 10:09 PM, Rob Steele >

Re: [R] Reading large files quickly

2009-05-09 Thread Rob Steele
Thanks guys, good suggestions. To clarify, I'm running on a fast multi-core server with 16 GB RAM under 64-bit CentOS 5 and R 2.8.1. Paging shouldn't be an issue since I'm reading in chunks and not trying to store the whole file in memory at once. Thanks again. Rob Steele wrote

[R] Reading large files quickly

2009-05-09 Thread Rob Steele
I'm finding that readLines() and read.fwf() take nearly two hours to work through a 3.5 GB file, even when reading in large (100 MB) chunks. The Unix command wc, by contrast, processes the same file in three minutes. Is there a faster way to read files in R? Thanks!
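One common speed-up suggested for this kind of job is scan() with an explicit `what` template, read in fixed-size chunks, which skips read.fwf()'s per-field substring work and type guessing. A self-contained sketch (the two-column whitespace-delimited layout and the tiny chunk size are illustrative; the original file was fixed-width, which scan() alone does not parse, so this is an approximation):

```r
# Write a small example file, then read it back in chunks with scan().
tf <- tempfile()
writeLines(c("a 1", "b 2", "c 3"), tf)

con <- file(tf, open = "r")
total <- 0
repeat {
  chunk <- scan(con, what = list(key = "", value = 0),
                nmax = 2, quiet = TRUE)   # nmax = records per chunk
  if (length(chunk$key) == 0) break       # zero-length result at EOF
  total <- total + sum(chunk$value)       # "process" the chunk
}
close(con)
total                                     # 6
```

Declaring the column types up front via `what` is what buys the speed; on a real multi-GB file the chunk size would be on the order of hundreds of thousands of records rather than 2.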