Re: [R] Vectorised operations

2016-05-19 Thread MacQueen, Don
In keeping with the theme of reducing unnecessary overhead (and using William's example data):

> system.time( vAve <- ave(a, i, FUN=cummax) )
   user  system elapsed
  0.125   0.003   0.127
> system.time( b <- unlist( lapply( split(a, i), cummax ) ) )
   user  system elapsed
  0.320   0.007   0.327
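For anyone reproducing this comparison, here is a small self-contained sketch with made-up data (not William's actual vectors). Note that the split()/lapply() route returns values in group order, so it only matches ave() element for element when the index vector is already sorted by group, as it is in William's example.

set.seed(1)
i <- rep(1:5, each = 4)                        # index vector, already in group order
a <- sample(-10:10, length(i), replace = TRUE) # toy data vector

vAve   <- ave(a, i, FUN = cummax)                                # per-group running maximum
vSplit <- unlist(lapply(split(a, i), cummax), use.names = FALSE) # same thing via split/lapply

identical(vAve, vSplit)                        # TRUE here, because i is sorted by group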

Re: [R] Vectorised operations

2016-05-18 Thread Jim Lemon
Hi John,
I may be misunderstanding what you want, but this seems to produce the output you specify:

A <- sample(-10:100, 100)
i <- rep(1:10, c(5:13, 19))
# replace the last value of x with the maximum
max_last <- function(x) return(c(x[-length(x)], max(x)))
as.vector(unlist(by(A, i, max_last)))

and this is w
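To make the behaviour of Jim's max_last() concrete, here is a tiny worked example with made-up numbers: it leaves all but the last element of a group untouched and replaces the final element with the group maximum, which is not the same thing as a running cumulative maximum.

max_last <- function(x) c(x[-length(x)], max(x))

x <- c(3, 7, 2, 5)
max_last(x)    # 3 7 2 7  (only the last value is replaced by the maximum)
cummax(x)      # 3 7 7 7  (running maximum, for comparison)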

Re: [R] Vectorised operations

2016-05-18 Thread Bert Gunter
Sorry, I can't help, but I wanted to note that "apply methods" are essentially just loops and are rarely much faster or slower than explicit loops. (Those of us who use them do so for reasons of code clarity and maintainability.)

Cheers,
Bert

Bert Gunter
"The trouble with having an open mind is that p
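As a small illustration of Bert's point that apply-family functions iterate much like a loop, here is a sketch with made-up data in which an explicit for loop and lapply() do the same group-wise work and produce identical results; neither one is "vectorised" at the C level.

a <- c(2, 6, 1, 8, 3, 9)
i <- c(1, 1, 1, 2, 2, 2)
groups <- split(a, i)

# explicit loop over the groups
out_loop <- vector("list", length(groups))
for (k in seq_along(groups)) out_loop[[k]] <- cummax(groups[[k]])

# the same computation via lapply
out_lapply <- lapply(groups, cummax)

identical(out_loop, unname(out_lapply))   # TRUE: same work, same result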

Re: [R] Vectorised operations

2016-05-18 Thread William Dunlap via R-help
ave(A, i, FUN=cummax) loops but is faster than your aggregate-based solution. E.g.,

> i <- rep(1:1e4, sample(0:210, replace=TRUE, size=1e4))
> length(i)
[1] 1056119
> a <- sample(-50:50, replace=TRUE, size=length(i))
> system.time( vAve <- ave(a, i, FUN=cummax) )
   user  system elapsed
  0.

[R] Vectorised operations

2016-05-18 Thread John Logsdon
Folks, I have some very long vectors - typically 1 million long - which are indexed by another vector of the same length, with values from 1 to a few thousand, so each sub-part of the vector may be a few hundred values long. I want to calculate the cumulative maximum of each sub-part of the main vector by
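To make the task concrete, here is a small illustration with made-up numbers (not John's actual data): for each block of the main vector picked out by the index vector, the goal is the running maximum within that block, e.g. via ave() as suggested elsewhere in the thread.

a <- c(2, 5, 1,  4, 3, 9, 6)   # main vector
i <- c(1, 1, 1,  2, 2, 2, 2)   # index vector marking the sub-parts

ave(a, i, FUN = cummax)
## [1] 2 5 5 4 4 9 9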