Dear all,

I use a 64-bit Windows 7 system with 16 GB of memory.
I have run an R script in batch mode with the following command (options go before the input file, so --save precedes the script name):
      R CMD BATCH --save masterassignment_2012_11_09.r
The execution was terminated with the following error message:

        Error in memory.size(size) : don't be silly!: your machine has a 4Gb address limit


I never get this error message in an interactive session (and I don't think I 
should, given the system). To be sure, I have just double-checked this in an 
interactive session, and it works fine.

I can think of two possible explanations:
a) The command line doesn't "know" the system, i.e. it "thinks" it is running 
on a 32-bit system (is this possible at all? a quick check is sketched below)
b) I run R on a remote desktop, and the command line thinks I am calling R from 
my laptop
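
To check (a), I could put something like the following at the top of the batch 
script. These are all base-R calls (memory.limit() is Windows-only); if it 
reports a 4-byte pointer size or an i386 architecture, the batch run is picking 
up a 32-bit build of R:

    ## Which build of R is this process actually running?
    cat("R version:    ", R.version.string, "\n")
    cat("Architecture: ", R.version$arch, "\n")           # "x86_64" on 64-bit R
    cat("Pointer size: ", .Machine$sizeof.pointer, "\n")  # 8 on 64-bit, 4 on 32-bit
    cat("Memory limit: ", memory.limit(), "Mb\n")         # capped near 4095 on 32-bit R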

Either way, I wouldn't know how to solve this.

In case you wonder why I run R in batch mode at all: I have a series of 
iterations in my model, each of which takes a huge amount of memory. Although 
I regularly force a garbage collection, R crashes after two or three iterations 
when used interactively. I thought that running and exiting R after each 
iteration would be the only effective way to "clean up" memory (a sketch of 
what I have in mind follows below). If you have any other solutions, I would 
be happy to hear about them as well.
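
For concreteness, the pattern I have in mind is roughly this; the helper script 
one_iteration.r, the file names, the iteration count, and run_model() are all 
made up for illustration. Each iteration runs in a fresh R process started via 
system2()/Rscript and saves its result with saveRDS() before the process exits, 
so its memory is released unconditionally:

    ## Hypothetical helper script "one_iteration.r", run via Rscript:
    ##   args   <- commandArgs(trailingOnly = TRUE)
    ##   i      <- as.integer(args[1])
    ##   result <- run_model(i)   # placeholder for one model iteration
    ##   saveRDS(result, sprintf("result_%02d.rds", i))

    ## Driver, kept in a lightweight master session:
    for (i in 1:10) {
      status <- system2("Rscript", args = c("one_iteration.r", as.character(i)))
      if (status != 0) stop("iteration ", i, " failed")
    }

    ## Collect the per-iteration results afterwards:
    results <- lapply(sprintf("result_%02d.rds", 1:10), readRDS)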



Laurent Franckx, PhD
VITO NV
Boeretang 200, 2400 MOL, Belgium
Tel. + 32 14 33 58 22
Skype: laurent.franckx
laurent.fran...@vito.be
Visit our website: www.vito.be/english and http://www.vito.be/transport
