Hi Ravi,
My hunch would be "no" because it seems awfully inefficient. Packages
are mirrored all over the world, and it seems rather silly to be
mirroring, updating, etc., such large datasets.
The good news is that if you just want a 10,000 x 100,000 matrix of
0/1s, it is trivial to generate:
X <- matrix(sample(0:1, 10000 * 100000, replace = TRUE), nrow = 10000)  # ~4 GB as an integer matrix
I am looking for some large datasets (10,000 rows & 100,000 columns or vice
versa) to create some test sets. I am not concerned about the individual
elements, since I will be converting them to binary (0/1) using arbitrary
thresholds.
Does any R package provide such big datasets?
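For the thresholding step mentioned above, a minimal sketch might look like this (the input matrix M and the cut-off value are arbitrary placeholders, not part of the original question):

## start from some numeric test data (placeholder values)
M <- matrix(rnorm(1000 * 100), nrow = 1000)
## binarize with an arbitrary cut-off; gives a 0/1 integer matrix
threshold <- 0
B <- (M > threshold) * 1L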