Dear R developers,

I'm using the great randomForest package (4.6-7) for many projects and recently 
stumbled upon a problem while writing unit tests for one of them:

On Windows, there are small numeric deviations between the 32- and 64-bit 
versions of R; this does not seem to be a problem on Linux or Mac.

R64 on Windows produces the same results as R64/R32 on Linux or Mac:

> set.seed(131)
> importance(randomForest(Species ~ ., data=iris))
             MeanDecreaseGini
Sepal.Length         9.452470
Sepal.Width          2.037092
Petal.Length        43.603071
Petal.Width         44.116904

R32 on Windows produces the following:

> set.seed(131)
> importance(randomForest(Species ~ ., data=iris))
             MeanDecreaseGini
Sepal.Length         9.433986
Sepal.Width          2.249871
Petal.Length        43.594159
Petal.Width         43.941870
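
As a workaround for the unit tests themselves (a sketch, not part of the original question), one can compare the importance values against reference numbers with an explicit tolerance instead of exactly; `all.equal()` accepts a `tolerance` argument that measures the mean relative difference:

```r
library(randomForest)

set.seed(131)
imp <- importance(randomForest(Species ~ ., data = iris))

# Reference values recorded on one platform (the R64 numbers above)
ref <- c(Sepal.Length = 9.452470, Sepal.Width  = 2.037092,
         Petal.Length = 43.603071, Petal.Width = 44.116904)

# The tolerance here (0.01, i.e. 1% mean relative difference) is a
# hypothetical choice, picked to absorb the 32-/64-bit deviations
# reported above; it must be tuned to the deviations actually observed.
isTRUE(all.equal(imp[, "MeanDecreaseGini"], ref, tolerance = 0.01))
```

This sidesteps the platform question entirely, at the cost of a weaker test.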

Is there a reason why this differs between the Windows builds? Do the 
compilers on Windows handle 32- and 64-bit builds differently than those 
on Linux or Mac?

Thank you very much for your help.

Best regards,
George

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel