On 12-08-09 3:41 PM, Eric Kaufmann wrote:
> I have a researcher who is running a large R job. The job is currently on
> iteration 3725 of 27500. Is there a way to kill it and recover the data
> that has been generated so far?
Not likely, unless the process is already writing the data out to a file.
It's conceivable that you could attach a debugger and interrupt the
process (or have the OS abort it and dump memory to a file for
post-mortem debugging), but it's not at all easy to find anything useful
that way.
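(For future runs, the "write it out as you go" approach is easy to set up.
Below is a minimal sketch of periodic checkpointing in R; the per-iteration
work, the object name "results", and the file name "checkpoint.rds" are all
illustrative placeholders, not anything from the original job.)

n_iter <- 27500
results <- vector("list", n_iter)

for (i in seq_len(n_iter)) {
  results[[i]] <- sqrt(i)  # placeholder for the real per-iteration work

  # Save everything computed so far every 100 iterations. If the job is
  # killed, readRDS("checkpoint.rds") recovers the completed iterations.
  if (i %% 100 == 0) {
    saveRDS(results[seq_len(i)], "checkpoint.rds")
  }
}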
Duncan Murdoch
> Thanks,
> Eric