Hello,
Would around two orders of magnitude interest you?
f1 <- function(Nodes, Weights){
  drop.index <- duplicated(Nodes)
  n.unique <- Nodes[!drop.index, , drop = FALSE]
  w.unique <- numeric(nrow(n.unique))
  for (i in seq_along(w.unique)){
    # sum the weights of every row that matches the i-th unique node
    hit <- Nodes[, 1] == n.unique[i, 1] & Nodes[, 2] == n.unique[i, 2]
    w.unique[i] <- sum(Weights[hit])
  }
  cbind(n.unique, w.unique)
}
You say "typical size of nodes --> 20x2".
Do you mean that nodes has many, many rows? Or many, many columns? Or both?
This is minimalist coding, but I'm not sure how fast it will run on your data:
aggregate(w, as.data.frame(nodes), sum)
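A small self-contained sketch of that one-liner on toy data (the names `nodes` and `w` are taken from the question; the values here are made up):

```r
# Toy data: three nodes, the (1, 1) coordinate appearing twice
nodes <- matrix(c(1, 1,
                  2, 3,
                  1, 1), ncol = 2, byrow = TRUE)
w <- c(0.1, 0.2, 0.3)

# Group the weights by node coordinates and sum within each group;
# the duplicate (1, 1) rows collapse to one row with weight 0.1 + 0.3
agg <- aggregate(w, as.data.frame(nodes), sum)
agg
```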
Jean
"Weiser, Constantin" wrote on 06/28/2012
Hi all. I have a (maybe trivial) problem with aggregating a list of weights.
Here is the problem:
- At first I have a set of nodes (X/Y coordinates) and associated weights,
where the set of nodes is typically not unique.
- I want to get a set of unique nodes and the sum of the associated weights.
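To make the setup concrete, here is a toy version of the problem with one base-R way to get the sums (a sketch with invented data, not the poster's code; `rowsum` groups on a pasted coordinate key):

```r
# Five weighted nodes; (0, 0) and (1, 2) each appear twice
nodes <- matrix(c(0, 0,
                  1, 2,
                  0, 0,
                  1, 2,
                  3, 4), ncol = 2, byrow = TRUE)
w <- c(10, 20, 30, 40, 50)

# Build a text key per node and sum the weights within each key:
# (0, 0) -> 40, (1, 2) -> 60, (3, 4) -> 50
key <- paste(nodes[, 1], nodes[, 2])
w.sum <- rowsum(w, key)
w.sum
```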