I've been reading in binary data collected via LabView for a project, and
after upgrading to R 3.5.0, the code returns an error indicating that the
"vector memory is exhausted". I'm happy to provide a sample binary file; even
ones that are quite small (12 MB) generate this error. (I wasn't sure whether
a binary file attached to this email would trigger a spam filter.)

# open the raw connection and read 2-byte little-endian integers
bin.read <- file(files[i], "rb")
datavals <- readBin(bin.read, what = integer(), size = 2,
                    n = 8 * hertz * 60 * 60000, endian = "little")
close(bin.read)

Error: vector memory exhausted (limit reached?)
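
If I understand readBin() correctly, it allocates its result vector up front
from n, so with hertz at, say, 500 the call above requests 8*500*60*60000 =
1.44e10 integers (roughly 57 GB) before a single byte is read. One workaround
I've been considering (a sketch, assuming the file holds nothing but the
2-byte integers) is to compute n from the actual file size rather than
passing an upper bound:

# sketch: read exactly as many 2-byte values as the file holds
# (assumes no header bytes before the data)
n.vals <- file.size(files[i]) / 2
bin.read <- file(files[i], "rb")
datavals <- readBin(bin.read, what = integer(), size = 2,
                    n = n.vals, endian = "little")
close(bin.read)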


sessionInfo()
R version 3.5.0 (2018-04-23)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Sierra 10.12.6


This does not happen in R 3.4 (R version 3.4.4 (2018-03-15) -- "Someone to
Lean On"): the vector is created and populated from the binary file without
issue, even with a 1 GB binary file.

Other files that are read in as CSV, even at 1 GB, load correctly in 3.5, so
I assume this comes down to some change between 3.4 and 3.5 in how a vector
of an explicitly specified length is allocated.
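
One possibly relevant item I noticed in the R 3.5.0 NEWS: on macOS there is
now a limit on the vector heap size, which can be overridden with the
R_MAX_VSIZE environment variable. If that is what I'm hitting, a sketch of a
workaround would be to raise the limit in ~/.Renviron and restart R:

# untested sketch: append a higher vector-heap limit to ~/.Renviron
# (100Gb is an arbitrary illustrative value)
cat("R_MAX_VSIZE=100Gb\n",
    file = file.path(Sys.getenv("HOME"), ".Renviron"),
    append = TRUE)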

Any help, suggestions or workarounds are greatly appreciated!
Val
