At 00:58 on 09/11/2024, Val wrote:
Hi All,
I am reading a data file (> 1B rows) and doing some date formatting like
dat <- fread(mydatafile)
dat$date1 <- as.Date(ymd(dat$date1))
However, I am getting an error message:
Error: cons memory exhausted (limit reached?)
The script was working when the number of rows was smaller.
Thank you, I will take a look.
On Fri, Nov 8, 2024 at 8:09 PM Ben Bolker wrote:
>
> Check the "high performance task view" on CRAN ...
> https://cran.r-project.org/web/views/HighPerformanceComputing.html
>
> On Fri, Nov 8, 2024, 7:58 PM Val wrote:
>>
>> Hi All,
>>
>> I am reading a data file (> 1B rows) and doing some date formatting ...
Thank you Jeff for the tip! I don't think I have 4 times as much
free memory to process the data...
I already allocated the maximum memory the system has.
On Fri, Nov 8, 2024 at 8:30 PM Jeff Newmiller wrote:
>
> There is always an implied "and do computations on it before writing the
> processed data out" when reading chunks of a file.
There is always an implied "and do computations on it before writing the
processed data out" when reading chunks of a file.
And you would almost certainly not be getting that error if you were not out of
memory. A good rule of thumb is that you need 4 times as much free memory to
process data.
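For concreteness, a minimal sketch of that read-process-write pattern,
using readr::read_csv_chunked; the file names, chunk size, and "%Y%m%d"
date format here are illustrative assumptions, not details from the
thread:

library(readr)       ## read_csv_chunked and its chunk callbacks
library(data.table)  ## fwrite, for fast appends to the output file

## Convert the date column of one chunk, then append the chunk to the
## output file, so only one chunk is held in memory at a time.
process_chunk <- function(chunk, pos) {
  chunk$date1 <- as.Date(as.character(chunk$date1), format = "%Y%m%d")
  ## pos is the row number of the chunk's first row, so it is 1 for the
  ## first chunk: write the header once, then append.
  fwrite(chunk, "mydatafile_out.csv", append = (pos > 1))
}

read_csv_chunked("mydatafile.csv",
                 callback = SideEffectChunkCallback$new(process_chunk),
                 chunk_size = 1e6)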
Is the problem reading the file in or processing it after it has been read
in?
Bert
On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help <
r-help@r-project.org> wrote:
> Can you tell us what is wrong with the "chunked" package which comes up
> when you Google "r read large file in chunks"?
>
Check the "high performance task view" on CRAN ...
https://cran.r-project.org/web/views/HighPerformanceComputing.html
On Fri, Nov 8, 2024, 7:58 PM Val wrote:
> Hi All,
>
> I am reading a data file (> 1B rows) and doing some date formatting like
> dat <- fread(mydatafile)
> dat$date1 <- as.Date(ymd(dat$date1))
Hi Jeff,
Memory was not an issue. The system only used 75% of the memory
allocated for the job.
I am trying to understand what "r read large file in chunks" is doing.
On Fri, Nov 8, 2024 at 7:50 PM Jeff Newmiller wrote:
>
> Then you don't have enough memory to process the whole thing at once.
Then you don't have enough memory to process the whole thing at once. Not
unlike stuffing your mouth with cookies and not being able to chew for lack of
space to move the food around in your mouth.
Now, can you answer my question?
On November 8, 2024 5:38:37 PM PST, Val wrote:
>The data was read. The problem is with processing.
The data was read. The problem is with processing.
On Fri, Nov 8, 2024 at 7:30 PM Bert Gunter wrote:
>
> Is the problem reading the file in or processing it after it has been read in?
>
> Bert
>
> On Fri, Nov 8, 2024 at 5:13 PM Jeff Newmiller via R-help
> wrote:
>>
>> Can you tell us what is wrong with the "chunked" package which comes up
>> when you Google "r read large file in chunks"?
Can you tell us what is wrong with the "chunked" package which comes up when
you Google "r read large file in chunks"?
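For what it's worth, a rough sketch of the chunked idiom, assuming
mydatafile is a delimited text file and date1 is stored as yyyymmdd
values (both assumptions for illustration):

library(chunked)
library(dplyr)

## Each chunk is read, transformed, and written out before the next one
## is read, so the whole file never sits in memory at once.
read_chunkwise("mydatafile.csv", chunk_size = 1e6) %>%
  mutate(date1 = as.Date(as.character(date1), format = "%Y%m%d")) %>%
  write_chunkwise("mydatafile_processed.csv")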
On November 8, 2024 4:58:18 PM PST, Val wrote:
>Hi All,
>
>I am reading a data file (> 1B rows) and doing some date formatting like
> dat <- fread(mydatafile)
> dat$date1 <- as.Date(ymd(dat$date1))
Hi All,
I am reading a data file (> 1B rows) and doing some date formatting like
library(data.table)  ## fread
library(lubridate)   ## ymd
dat <- fread(mydatafile)
dat$date1 <- as.Date(ymd(dat$date1))
However, I am getting an error message:
Error: cons memory exhausted (limit reached?)
The script was working when the number of rows was smaller.
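A lower-overhead variant of the same conversion, sketched under the
assumption that date1 holds yyyymmdd values: since fread returns a
data.table, := updates the column by reference, avoiding the full-table
copy that dat$date1 <- ... can trigger, and as.IDate stores dates as
compact integers.

library(data.table)

dat <- fread(mydatafile)
## := modifies the column in place; no copy of the billion-row table.
dat[, date1 := as.IDate(as.character(date1), format = "%Y%m%d")]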