Re: [R] Limit

2024-11-08 Thread Rui Barradas
At 00:58 on 09/11/2024, Val wrote: Hi All, I am reading a data file (> 1B rows) and doing some date formatting like dat=fread(mydatafile) dat$date1 <- as.Date(ymd(dat$date1)) However, I am getting an error message saying Error: cons memory exhausted (limit reached?) The

Re: [R] Limit

2024-11-08 Thread Val
Thank you, I will take a look. On Fri, Nov 8, 2024 at 8:09 PM Ben Bolker wrote: > > Check the "high performance task view" on CRAN ... > https://cran.r-project.org/web/views/HighPerformanceComputing.html > > On Fri, Nov 8, 2024, 7:58 PM Val wrote: >> >> Hi All, >> >> I am reading a data file (>

Re: [R] Limit

2024-11-08 Thread Val
Thank you Jeff for the tip! I don't think I have 4 times as much free memory to process the data... I allocated the max memory the system has. On Fri, Nov 8, 2024 at 8:30 PM Jeff Newmiller wrote: > > There is always an implied "and do computations on it before writing the > processed data o

Re: [R] Limit

2024-11-08 Thread Jeff Newmiller via R-help
There is always an implied "and do computations on it before writing the processed data out" when reading chunks of a file. And you would almost certainly not be getting that error if you were not out of memory. A good rule of thumb is that you need 4 times as much free memory to process data
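A rough back-of-envelope sketch of why that rule of thumb bites at this scale (assuming 1e9 rows, per the original post): each full-length numeric or Date column already costs about 8 GB, and the conversion pipeline keeps several such allocations alive at once.

    rows <- 1e9                 # assumed row count from the original post
    col_bytes <- 8 * rows       # one double/Date column: ~8e9 bytes
    col_bytes / 2^30            # ~7.45 GiB per column
    # ymd() allocates a full intermediate vector plus a full Date result,
    # and assigning back with dat$date1 <- ... on a data.table can copy
    # the whole table, so transient peak usage is a small multiple of the
    # data's resting size.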

Re: [R] Limit

2024-11-08 Thread Bert Gunter
Is the problem reading the file in or processing it after it has been read in? Bert On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help <r-help@r-project.org> wrote: > Can you tell us what is wrong with the "chunked" package which comes up > when you Google "r read large file in chunks"? >

Re: [R] Limit

2024-11-08 Thread Ben Bolker
Check the "high performance task view" on CRAN ... https://cran.r-project.org/web/views/HighPerformanceComputing.html On Fri, Nov 8, 2024, 7:58 PM Val wrote: > Hi All, > > I am reading data file ( > 1B rows) and do some date formatting like > dat=fread(mydatafile) > dat$date1 <- as.Da

Re: [R] Limit

2024-11-08 Thread Val
Hi Jeff, Memory was not an issue. The system only used 75% of the memory allocated for the job. I am trying to understand what "r read large file in chunks" is doing. On Fri, Nov 8, 2024 at 7:50 PM Jeff Newmiller wrote: > > Then you don't have enough memory to process the whole thing at once.

Re: [R] Limit

2024-11-08 Thread Jeff Newmiller via R-help
Then you don't have enough memory to process the whole thing at once. Not unlike stuffing your mouth with cookies and not being able to chew for lack of space to move the food around in your mouth. Now, can you answer my question? On November 8, 2024 5:38:37 PM PST, Val wrote: >The data was r

Re: [R] Limit

2024-11-08 Thread Val
The data was read. The problem is with processing. On Fri, Nov 8, 2024 at 7:30 PM Bert Gunter wrote: > > Is the problem reading the file in or processing it after it has been read in? > > Bert > > On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help > wrote: >> >> Can you tell us what is wr
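Since fread() returns a data.table, one lower-copy variant of the failing step is a by-reference update; a minimal sketch (untested at this scale, and note that ymd() already returns a Date, so the as.Date() wrapper only adds work):

    library(data.table)
    library(lubridate)

    dat <- fread("mydatafile")
    # := replaces the column in place; dat$date1 <- ... on a data.table
    # can instead copy the entire table
    dat[, date1 := ymd(date1)]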

Re: [R] Limit

2024-11-08 Thread Jeff Newmiller via R-help
Can you tell us what is wrong with the "chunked" package which comes up when you Google "r read large file in chunks"? On November 8, 2024 4:58:18 PM PST, Val wrote: >Hi All, > >I am reading a data file (> 1B rows) and doing some date formatting like > dat=fread(mydatafile) > dat$date1 <- a
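For reference, chunked's README-style usage looks roughly like the sketch below (treat the exact function names as an assumption and check the package docs): the file is streamed through dplyr verbs one chunk at a time, so only a single chunk is ever held in memory.

    library(chunked)
    library(dplyr)

    read_chunkwise("mydatafile", chunk_size = 1e6) %>%
      mutate(date1 = as.Date(date1)) %>%   # applied per chunk, not to 1B rows
      write_chunkwise("mydatafile_processed.csv")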

[R] Limit

2024-11-08 Thread Val
Hi All, I am reading a data file (> 1B rows) and doing some date formatting like dat=fread(mydatafile) dat$date1 <- as.Date(ymd(dat$date1)) However, I am getting an error message saying Error: cons memory exhausted (limit reached?) The script was working when the number of rows we
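If the full table really cannot be held at once, a plain data.table workaround is to slice the file with fread's skip/nrows and append results as you go. A sketch under assumed conditions (delimited file with a single header line; hypothetical file names):

    library(data.table)
    library(lubridate)

    chunk <- 5e6                                    # rows per slice; tune to RAM
    hdr   <- names(fread("mydatafile", nrows = 0))  # column names only
    skip  <- 1                                      # skip the header line
    repeat {
      dt <- tryCatch(
        fread("mydatafile", skip = skip, nrows = chunk,
              header = FALSE, col.names = hdr),
        error = function(e) data.table())           # reading past EOF -> stop
      if (nrow(dt) == 0L) break
      dt[, date1 := ymd(date1)]                     # by-reference update
      fwrite(dt, "mydatafile_out.csv", append = skip > 1)
      skip <- skip + chunk
    }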