Good day,

I have noticed that summing the rows of a comparison on a large dgTMatrix causes a segmentation fault.

library(Matrix)
aMatrix <- new("dgTMatrix",
               i = as.integer(sample(200000, 10000) - 1),
               j = as.integer(sample(50000, 10000) - 1),
               x = rnorm(10000),
               Dim = c(200000L, 50000L))
totals <- rowSums(aMatrix == 0)  # Segmentation fault.

The server has 768 GB of RAM, and memory use never came close to exhausting it during this operation.
Converting the sparse matrix to an ordinary dense matrix works fine:

big <- as.matrix(aMatrix)
totals <- rowSums(big == 0)      # Uses more RAM, but there is no segmentation fault and the result is returned.
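As a stopgap (a sketch, not part of the report itself): because `aMatrix == 0` is mostly TRUE and therefore effectively dense, the zero count per row can instead be derived directly from the triplet slots. This assumes the matrix has no explicitly stored zeros in its `x` slot and no duplicated (i, j) pairs:

```r
library(Matrix)
# Count zeros per row without materialising the dense comparison result.
# tabulate() counts how often each (1-based) row index occurs among the
# stored entries, i.e. the number of structural nonzeros per row.
nnzPerRow <- tabulate(aMatrix@i + 1L, nbins = nrow(aMatrix))
totals <- ncol(aMatrix) - nnzPerRow
```

If explicitly stored zeros are possible, restrict the tabulation to `aMatrix@i[aMatrix@x != 0]` first.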

Could rowSums() be made more robust for dgTMatrix?

--------------------------------------
Dario Strbenac
University of Sydney
Camperdown NSW 2050
Australia

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
