t standard errors is easily implemented with
the command "cluster(id)". Is there something similar in R?
Thank you for any advice,
Marc
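A common way to get clustered standard errors in R (an assumption on my part, not something confirmed in this thread) is the sandwich + lmtest pair; the data frame `dat`, the formula, and the cluster variable `id` below are hypothetical placeholders:

```r
# Sketch: cluster-robust standard errors, analogous to Stata's cluster(id).
# Assumes the 'sandwich' and 'lmtest' packages are installed.
library(sandwich)
library(lmtest)

fit <- lm(y ~ x, data = dat)

# Cluster-robust variance-covariance matrix, clustered on dat$id
vc <- vcovCL(fit, cluster = ~ id)

# Coefficient table recomputed with the clustered standard errors
coeftest(fit, vcov. = vc)
```

vcovCL() accepts the cluster either as a one-sided formula (as above) or as a vector of cluster identifiers.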
Sent: Tuesday, 01 July 2014 at 10:07
From: "Marc Jekel"
To: r-help@r-project.org
Subject: maximum likelihood
Dear R community,
I am still struggling a bit with how R does memory allocation and how to optimize my code to minimize
working memory load. Simon (thanks!) and others gave me the hint to use the command "gc()"
to clean up memory, which works quite nicely but appears to me to be more like a "fix" to a
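The usual pattern is to drop the reference first and then trigger a collection; a minimal sketch (the object `big` is a made-up placeholder):

```r
# Sketch: freeing workspace memory explicitly.
big <- matrix(rnorm(1e6), ncol = 100)
print(object.size(big), units = "Mb")  # memory held by the object

rm(big)   # remove the reference from the workspace
gc()      # run the garbage collector; may return freed pages to the OS
```

Note that gc() mainly reports and tidies; memory for objects still referenced anywhere (e.g. in another environment) cannot be reclaimed.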
Dear R community,
I was observing a memory issue in R (latest 64-bit R version running on a
Win 7 64-bit system) that made me curious.
I kept track of the memory of my PC allocated to R to calculate + keep
several objects in the workspace. If I then save the workspace, close R,
and open the wo
don't know but I checked
several R books and could not find a command to do this).
It would be great to have a function in which it is possible to define a
sequence of commands that can be applied column/row-wise.
Thanks for a hint,
Marc
--
Dipl.-Psych. Marc Jekel
MPI for Research on
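The request above (a sequence of commands applied column- or row-wise) is what apply() does when given an anonymous function; the matrix and the two steps here are illustrative assumptions:

```r
# Sketch: applying a sequence of commands column-wise with apply().
m <- matrix(1:12, nrow = 3)

# MARGIN = 2 applies the function to each column; MARGIN = 1 to each row.
col_summary <- apply(m, 2, function(x) {
  x <- x - mean(x)   # step 1: center the column
  sum(x^2)           # step 2: sum of squared deviations
})
col_summary
```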
Dear R-Fans,
I have been lately working on some plots in R that I save as pdf via the
pdf() command. I have realized that when I open those files in Adobe and
then re-save them within Adobe ("save as..."), the size of the pdf files
decreases dramatically (e.g., from 4 MB to 1 MB). This can also be obse
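Two things worth trying from within R (a sketch under the assumption that qpdf or Ghostscript is installed; the file name is a placeholder): keep stream compression on in pdf(), and re-compress the finished file with tools::compactPDF().

```r
# Sketch: shrinking pdf() output. compress = TRUE is the default in recent R.
pdf("plot.pdf", compress = TRUE)
plot(rnorm(1000))
dev.off()

# Re-compress an existing file; uses qpdf or Ghostscript if available.
tools::compactPDF("plot.pdf", gs_quality = "ebook")
```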
Hi again,
I have checked the same code (see below) using MATLAB. It produces the
same error (i.e., equal numbers are evaluated as unequal). Am I missing
something?
Thanks for help!
Marc
Marc Jekel wrote:
Hello R Fans,
Another question for the community that really frightened me today
Hello R Fans,
Another question for the community that really frightened me today. The
following logical comparison produces FALSE as output:
t = sum((c(.7,.69,.68,.67,.66)-.5)*c(1,1,-1,-1,1))
tt = sum((c(.7,.69,.68,.67,.66)-.5)*c(1,-1,1,1,-1))
t == tt
This is really strange behavior. Mos
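Both sums are mathematically 0.2, but the two orderings accumulate different rounding errors in double precision, so == can return FALSE. The standard remedy is a tolerance-based comparison:

```r
# Sketch: compare floating-point results with a tolerance instead of ==.
t  <- sum((c(.7, .69, .68, .67, .66) - .5) * c(1,  1, -1, -1,  1))
tt <- sum((c(.7, .69, .68, .67, .66) - .5) * c(1, -1,  1,  1, -1))

t == tt                    # may be FALSE: rounding differs between orderings
isTRUE(all.equal(t, tt))   # TRUE: equal up to a small numeric tolerance
abs(t - tt) < 1e-12        # equivalent manual tolerance check
```

This also explains the MATLAB result mentioned above: it is a property of IEEE 754 arithmetic, not of any one language.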
Dear R Fans,
I have the opportunity to buy a new computer for my simulations in R. My
goal is to get the execution of R code as fast as possible. I know that
the number of cores and the working memory capacity are crucial for
computer performance but maybe someone has experience/knowledge whic
Dear R Fans,
I was recently asking myself how fast R is at code execution. I have
been running simulations lately that take quite some time to execute, and
I was wondering if it is reasonable at all to do more computationally
intensive projects with R. Of course, it is possible to compare
execu
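For simulations, the single biggest speed lever in R is usually vectorization rather than hardware; a minimal timing sketch (the workload is illustrative):

```r
# Sketch: timing an explicit loop vs. a vectorized equivalent.
x <- rnorm(1e6)

system.time({            # explicit loop over elements
  s <- 0
  for (v in x) s <- s + v^2
})

system.time(sum(x^2))    # vectorized; typically far faster
```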
Hi R Fans,
I stumbled across a strange (I think) bug in R 2.9.1. I have read in a
data file with 5934 rows and 9 columns with the commands:
daten = data.frame(read.table("C:/fussball.dat",header=TRUE))
Then I needed a subset of the data file:
newd = daten[daten[,1]!=daten[,2],]
--> two valu
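The reported behavior is not reproducible from the snippet alone, but the subsetting step itself can be sketched on toy data (`daten` below is a made-up stand-in for the original file; note that read.table() already returns a data frame, so the data.frame() wrapper in the original call is redundant):

```r
# Sketch: keep rows where column 1 and column 2 differ.
daten <- data.frame(home = c(1, 2, 3, 4), away = c(1, 3, 3, 5))

newd <- daten[daten[, 1] != daten[, 2], ]
newd   # rows where the two columns are unequal
```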
Dear R Fans,
I have a simple problem but cannot find any reference to the solution.
I want to do calculations with small numbers (for maximum likelihood estimations).
The lowest value R stores by default is about 1*10^-323; a smaller number like
1*10^-324 is stored as 0. How can I circumvent this pro
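For likelihoods, the standard answer is not to extend the floating-point range but to work on the log scale, where products of tiny terms become sums; a minimal sketch with illustrative numbers:

```r
# Sketch: avoid underflow by working on the log scale.
p <- rep(1e-5, 80)

prod(p)        # underflows to 0 in double precision
sum(log(p))    # finite log-likelihood instead of an underflowed product

# Many density functions support this directly via log = TRUE:
loglik <- sum(dnorm(rnorm(100), mean = 0, sd = 1, log = TRUE))
```

Optimizers such as optim() are then applied to the negative log-likelihood rather than the likelihood itself.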