Hi All,

I created a .R file with source code that calls functions from an R
package (for example, fTrading).

I then ran the application in two different configurations:

1. I started one R session, ran the application with the source
("my_application.R") command, and measured how long it took to run.

2. I started two R sessions on the same machine, executed the same
source ("my_application.R") command in each, and measured the run
times.

The time I measured for each application in #2 was longer than the time
I measured for the application in #1.
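The two configurations above can be reproduced from the shell. Below is a minimal sketch; a CPU-bound shell loop stands in for the real workload so the script runs anywhere (substitute something like `Rscript my_application.R` for `work` to time the actual application):

```shell
#!/bin/sh
# work() is a hypothetical stand-in for running my_application.R:
# a purely CPU-bound loop with no I/O.
work() {
  i=0
  while [ "$i" -lt 100000 ]; do i=$((i + 1)); done
}

# Configuration 1: a single copy, timed end to end.
t0=$(date +%s)
work
t1=$(date +%s)
d1=$((t1 - t0))
echo "one copy:  ${d1}s"

# Configuration 2: two copies started together; wait for both to finish.
t0=$(date +%s)
work &
work &
wait
t1=$(date +%s)
d2=$((t1 - t0))
echo "two copies: ${d2}s"
```

Comparing the two printed durations gives the same kind of measurement described above: wall-clock time for one copy running alone versus two copies running concurrently.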

The application was run on a 4-core machine running Linux.

While the application ran, I used "mpstat" to observe CPU usage: it was
about 25% for #1 and about 50% for #2.

No other processes were running on the machine.

My question is, why would #2 be slower than #1?

Thanks,
Peter


______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel