The test is only an example, and the data is an example too. I don't think the size of the difference is the problem, because we can make the data larger and the difference will grow. On my system, the original test shows Windows with the best time, and a difference larger than 10% between the generic Linux binaries and Linux compiled from source. I'll try to follow the advanced users' suggestions (to investigate further) and report back to the list.
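Before comparing the two operating systems, it helps to repeat the timing several times on each machine to see whether the cross-OS gap exceeds the run-to-run noise. A minimal sketch in R, where `workload()` is a hypothetical stand-in for the actual benchmark code (not shown in this thread):

```r
# Stand-in workload; substitute the real benchmark code here.
workload <- function() {
  x <- matrix(rnorm(4e5), nrow = 200)  # hypothetical numeric task
  crossprod(x)                         # t(x) %*% x
}

# Collect the elapsed time from several independent runs.
times <- replicate(10, system.time(workload())["elapsed"])

summary(times)  # spread of elapsed times across runs
sd(times)       # run-to-run noise on this machine
```

If the difference between the Windows and Linux timings is within a few standard deviations of the per-machine spread, it may just be noise rather than a real OS effect.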
Thanks,
C.

On Mon, Jun 29, 2009 at 10:20 PM, Raymond Wan <r....@aist.go.jp> wrote:
>
> Hi,
>
> I. Soumpasis wrote:
>> 2009/6/29 César Freitas <cafanselm...@yahoo.com.br>
>> This is true. So I tried the same computer with Windows XP and Ubuntu 8.10
>> 64-bit, dual core @ 3 GHz and 4MB RAM.
>>
>> Windows 32-bit results:
>>    user  system elapsed
>>   21.66    0.02   21.69
>>
>> Linux 64-bit results:
>>    user  system elapsed
>>  27.242   0.004  27.275
>>
>> This difference is small and is well explained by what the advanced
>> users have said.
>
> One minor comment which I forgot to mention: a difference of 6 seconds
> for a run of about 30 seconds is worth noting, but may not be
> statistically significant. Especially when we are now comparing two
> completely different OSes and, thus, two different ways of timing a
> program. If the data file you are using is also on a different file
> system, then one of them could be fragmented, etc.
>
> My point is that your result might be real (and others have given you
> reasons for it), but don't worry too much about it... :-)
>
> Ray

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.