[Rd] Rcartogram package - error message

2015-07-24 Thread SuzukiBlue
I am trying to install two R packages to produce cartograms like those in
the work of Gastner and Newman:
http://www.pnas.org/content/101/20/7499.full.pdf

but I have problems installing the Rcartogram and rdyncall packages.
Neither is available on CRAN, so both have to be installed from archives,
and both produce errors:

install.packages("C:/Users/Milena/Downloads/Rcartogram_0.2-2.tar.gz",
                 repos = NULL, type = "source")

Installing package into ‘C:/Users/Milena/Documents/R/win-library/3.2’
(as ‘lib’ is unspecified)

* installing *source* package 'Rcartogram' ...

   **********************************************
   WARNING: this package has a configure script
         It probably needs manual configuration
   **********************************************


** libs

*** arch - i386

Warning: running command 'make -f "Makevars" -f
"C:/PROGRA~1/R/R-3.2.0/etc/i386/Makeconf" -f
"C:/PROGRA~1/R/R-3.2.0/share/make/winshlib.mk" SHLIB="Rcartogram.dll"
OBJECTS="Rcart.o cart.o"' had status 127

ERROR: compilation failed for package 'Rcartogram'
* removing 'C:/Users/Milena/Documents/R/win-library/3.2/Rcartogram'

Warning in install.packages :
  running command '"C:/PROGRA~1/R/R-3.2.0/bin/x64/R" CMD INSTALL -l
"C:\Users\Milena\Documents\R\win-library\3.2"
"C:/Users/Milena/Downloads/Rcartogram_0.2-2.tar.gz"' had status 1

Warning in install.packages :
  installation of package
‘C:/Users/Milena/Downloads/Rcartogram_0.2-2.tar.gz’ had non-zero exit status


install.packages("C:/Users/Milena/Downloads/rdyncall_0.7.5.tar.gz",
                 repos = NULL, type = "source")

Installing package into ‘C:/Users/Milena/Documents/R/win-library/3.2’
(as ‘lib’ is unspecified)

* installing *source* package 'rdyncall' ...
** package 'rdyncall' successfully unpacked and MD5 sums checked
** libs

*** arch - i386

Warning: running command 'make -f "Makevars.win" -f
"C:/PROGRA~1/R/R-3.2.0/etc/i386/Makeconf" -f
"C:/PROGRA~1/R/R-3.2.0/share/make/winshlib.mk" SHLIB="rdyncall.dll"
OBJECTS="rcallback.o rdyncall.o rdynload.o rpack.o rpackage.o rutils.o
rutils_float.o rutils_str.o"' had status 127

ERROR: compilation failed for package 'rdyncall'
* removing 'C:/Users/Milena/Documents/R/win-library/3.2/rdyncall'

Warning in install.packages :
  running command '"C:/PROGRA~1/R/R-3.2.0/bin/x64/R" CMD INSTALL -l
"C:\Users\Milena\Documents\R\win-library\3.2"
"C:/Users/Milena/Downloads/rdyncall_0.7.5.tar.gz"' had status 1

Warning in install.packages :
  installation of package ‘C:/Users/Milena/Downloads/rdyncall_0.7.5.tar.gz’
had non-zero exit status

rdyncall finally loaded on a friend's machine but not on mine. Is there
something wrong with my installation?
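
Both installs fail with "had status 127", which I gather means the make
command itself could not be found, so this looks like an Rtools/PATH
problem rather than a problem with the packages themselves. A quick check
from within R, assuming Rtools is what should provide make on Windows:

# "" from Sys.which means no make binary is visible to R
Sys.which("make")
# the Rtools bin directory should appear somewhere in this list
strsplit(Sys.getenv("PATH"), ";")[[1]]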

My sessionInfo() output follows:

R version 3.2.0 (2015-04-16)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 8 x64 (build 9200)

locale:
[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252    LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
[5] LC_TIME=English_United Kingdom.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base 

other attached packages:
[1] sp_1.1-1

loaded via a namespace (and not attached):
[1] tools_3.2.0 grid_3.2.0  lattice_0.20-31


[Rd] Memory limitations for parallel::mclapply

2015-07-24 Thread Joshua Bradley
Hello,

I have been having issues using parallel::mclapply in a memory-efficient
way and would like some guidance. I am using a 40-core machine with 96 GB
of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores, and each
time it has practically brought the machine to a standstill, to the point
where I have to do a hard reset.

When running mclapply with 10 mc.cores, I can see that each process takes
7.4% (~7 GB) of memory. My use case for mclapply is the following: run
mclapply over a list of 15 names; for each name, the worker refers to a
large pre-computed data.table to compute some stats and returns those
stats. Ideally I want to use the large data.table as shared memory, but
the number of mc.cores I can use is limited because each one requires
7 GB. Someone posted this exact same issue on Stack Overflow a couple of
years ago, but it never got answered.
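
In rough code terms, what I am doing looks like the following (a minimal
sketch only; big_table, names_list, and get_stats are hypothetical
stand-ins for the real objects):

library(parallel)
library(data.table)

# Hypothetical stand-ins for the real pre-computed table and name list
big_table  <- data.table(name  = rep(sprintf("name%02d", 1:15), each = 1e5),
                         value = rnorm(15 * 1e5))
names_list <- unique(big_table$name)

# Compute some stats for one name against the pre-computed table
get_stats <- function(nm) {
  sub <- big_table[name == nm]
  list(name = nm, mean = mean(sub$value), sd = sd(sub$value))
}

# With the real table, each of these workers ends up holding ~7 GB
stats <- mclapply(names_list, get_stats, mc.cores = 10)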

Do I have to manually tell mclapply to use shared memory (and if so, how)?
Or is this type of job better handled with the doParallel package and the
foreach approach?

Josh Bradley



Re: [Rd] Memory limitations for parallel::mclapply

2015-07-24 Thread Ista Zahn
Hi Josh,

I think we need some more details, including code and information about
your operating system. My machine has only 12 GB of RAM, but I can run
this quite comfortably (no swap, other processes using memory, etc.):

library(parallel)
library(data.table)

d <- data.table(a = rnorm(5000),
                b = runif(5000),
                c = sample(letters, 5000, replace = TRUE),
                d = 1:5000,
                g = rep(letters[1:10], each = 500))

system.time(
  means <- mclapply(unique(d$g),
                    function(x) sapply(d[g == x, list(a, b, d)], mean),
                    mc.cores = 5)
)
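
(Note that mclapply relies on fork(), so the children share d
copy-on-write on Linux and OS X; on Windows mclapply only supports
mc.cores = 1, which may matter here since we don't know your OS yet.)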

In other words, I don't think there is anything inherent in the kind of
operation you describe that requires the large data object to be copied.
So, as usual, the devil is in the details, which you haven't yet
described.
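
As for the doParallel/foreach question, a rough fork-based equivalent of
the example above would be something like this (a sketch only, untested,
and Unix-alikes only since it relies on forking):

library(doParallel)
library(foreach)

# A fork-based cluster: workers inherit d and data.table from this session
cl <- parallel::makeForkCluster(5)
registerDoParallel(cl)

means <- foreach(x = unique(d$g)) %dopar%
  sapply(d[g == x, list(a, b, d)], mean)

stopCluster(cl)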

Best,
Ista


On Fri, Jul 24, 2015 at 4:21 PM, Joshua Bradley wrote:
> Hello,
>
> I have been having issues using parallel::mclapply in a memory-efficient
> way and would like some guidance. I am using a 40-core machine with 96 GB
> of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores, and each
> time it has practically brought the machine to a standstill, to the point
> where I have to do a hard reset.
>
> When running mclapply with 10 mc.cores, I can see that each process takes
> 7.4% (~7 GB) of memory. My use case for mclapply is the following: run
> mclapply over a list of 15 names; for each name, the worker refers to a
> large pre-computed data.table to compute some stats and returns those
> stats. Ideally I want to use the large data.table as shared memory, but
> the number of mc.cores I can use is limited because each one requires
> 7 GB. Someone posted this exact same issue on Stack Overflow a couple of
> years ago, but it never got answered.
>
> Do I have to manually tell mclapply to use shared memory (and if so,
> how)? Or is this type of job better handled with the doParallel package
> and the foreach approach?
>
> Josh Bradley
