Thank you Jeff for the tip!  I don't think I have 4 times as much
free memory available to process the data; I already allocated the
maximum memory the system has.
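
For reference, a minimal sketch of the chunk-by-chunk approach Jeff
describes below, using data.table's fread/fwrite. The file names, the
date1 column, its ISO date format, and the chunk size are assumptions to
be adjusted to the real data; the "chunked" package Jeff mentions is
another way to get the same effect.

    library(data.table)

    infile     <- "mydatafile.csv"          # placeholder input file
    outfile    <- "mydatafile_out.csv"      # placeholder output file
    chunk_size <- 5e7                       # rows per chunk; tune to available RAM

    hdr  <- names(fread(infile, nrows = 0)) # read the header once
    skip <- 1                               # skip the header line on the first chunk
    repeat {
      dat <- tryCatch(
        fread(infile, skip = skip, nrows = chunk_size,
              header = FALSE, col.names = hdr),
        error = function(e) data.table())   # past the end of the file
      if (nrow(dat) == 0) break
      ## convert in place; as.IDate stores dates as integers, which is lighter
      ## than building a second copy of the whole column at once
      dat[, date1 := as.IDate(date1)]
      fwrite(dat, outfile, append = skip > 1)  # write the header only once
      skip <- skip + chunk_size
      rm(dat); gc()                         # free the chunk before the next read
    }

Each pass only ever holds one chunk in memory, so the 4x rule of thumb
applies to the chunk rather than to the whole 1B-row table.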

On Fri, Nov 8, 2024 at 8:30 PM Jeff Newmiller <jdnew...@dcn.davis.ca.us> wrote:
>
> There is always an implied "and do computations on it before writing the 
> processed data out" when reading chunks of a file.
>
> And you would almost certainly not be getting that error if you were not out
> of memory.  A good rule of thumb is that you need 4 times as much free memory
> to process data as you need to read it in.
>
> On November 8, 2024 6:08:16 PM PST, Val <valkr...@gmail.com> wrote:
> >Hi Jeff,
> >
> >Memory was not an issue. The system only used 75% of the memory
> >allocated for the job.
> >
> > I am trying to understand what the "r read large file in chunks" approach is actually doing.
> >
> >On Fri, Nov 8, 2024 at 7:50 PM Jeff Newmiller <jdnew...@dcn.davis.ca.us> 
> >wrote:
> >>
> >> Then you don't have enough memory to process the whole thing at once. Not 
> >> unlike stuffing your mouth with cookies and not being able to chew for 
> >> lack of space to move the food around in your mouth.
> >>
> >> Now, can you answer my question?
> >>
> >> On November 8, 2024 5:38:37 PM PST, Val <valkr...@gmail.com> wrote:
> >> >The data was read. The problem is with processing.
> >> >
> >> >On Fri, Nov 8, 2024 at 7:30 PM Bert Gunter <bgunter.4...@gmail.com> wrote:
> >> >>
> >> >> Is the problem reading the file in or processing it after it has been 
> >> >> read in?
> >> >>
> >> >> Bert
> >> >>
> >> >> On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help 
> >> >> <r-help@r-project.org> wrote:
> >> >>>
> >> >>> Can you tell us what is wrong with the "chunked" package which comes 
> >> >>> up when you Google "r read large file in chunks"?
> >> >>>
> >> >>> On November 8, 2024 4:58:18 PM PST, Val <valkr...@gmail.com> wrote:
> >> >>> >Hi All,
> >> >>> >
> >> >>> >I am reading a data file (> 1B rows) and doing some date formatting like
> >> >>> >      library(data.table)
> >> >>> >      library(lubridate)
> >> >>> >      dat <- fread(mydatafile)
> >> >>> >      dat$date1 <- as.Date(ymd(dat$date1))
> >> >>> >
> >> >>> >However, I am getting an error message saying that
> >> >>> >    Error: cons memory exhausted (limit reached?)
> >> >>> >
> >> >>> >The script was working when the number of rows was around 650M.
> >> >>> >
> >> >>> >Is there another way to handle a big data set in R?
> >> >>> >
> >> >>> >
> >> >>> >Thank you.
> >> >>> >
> >> >>>
> >> >>> --
> >> >>> Sent from my phone. Please excuse my brevity.
> >> >>>
> >>
> >> --
> >> Sent from my phone. Please excuse my brevity.
>
> --
> Sent from my phone. Please excuse my brevity.
