Hello list,

I have been working on learning ggplot for its extraordinary flexibility 
compared to base plotting and have been developing a function to create a 
"Minitab-like" process capability chart.  

*sigh* Some of the people I interface with can only understand the data when it 
is presented in Minitab format.

The function creates a ggplot container holding 10 ggplot items: the main 
process capability chart, a Q-Q plot, and text boxes with all the capability 
data.  When I run the function, the elapsed time is on the order of 3 seconds, 
the vast majority of which is user time; sys time is negligible.  A bit of 
hacking shows that each call of the form

gt1 <- ggplot_gtable(ggplot_build(p))

takes on the order of 1/3 of a second.  These times are on a 3.2 GHz Xeon 
workstation.  I'd like to see the entire function complete in less than a 
second.  My questions are:

1) Am I misusing ggplot, hence the performance hit?
2) Is there any way to increase the speed of this portion of the code?
3) Or am I simply asking ggplot to crunch so much that it is inevitable that 
it will take a while to process?
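For what it's worth, here is a minimal sketch of how I separated the two 
conversion steps to see where the time goes.  This is not taken from 
vectis.cap() itself -- 'p' here is just a stand-in Q-Q plot built from the 
same chickwts data:

```r
library(ggplot2)

# Stand-in for one of the ten panels: a simple normal Q-Q plot
p <- ggplot(chickwts, aes(sample = weight)) + stat_qq()

# Time the two stages of the conversion separately:
t_build  <- system.time(pb <- ggplot_build(p))    # stat/data computation
t_gtable <- system.time(gt <- ggplot_gtable(pb))  # grob and layout construction

print(rbind(build = t_build, gtable = t_gtable))
```

In my hands the gtable construction, not the build step, dominates, which is 
why I focused on the ggplot_gtable calls.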

To that end, the function, vectis.cap(), can be downloaded from 
http://pastebin.com/05s5RKYw .  It runs to 962 lines of code, so I won't paste 
it here.  The offending ggplot_gtable calls are at lines 909-918.

Usage:
vectis.cap(chickwts$weight, target = 300, USL = 400, LSL = 100)
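In case it helps anyone reproduce the timings, a base-R profiling run of that 
same call (this assumes vectis.cap() has been sourced from the pastebin link 
above) would look something like:

```r
# Profile the full call to see which functions consume the ~3 s of user time.
Rprof("vectis_cap.out")
vectis.cap(chickwts$weight, target = 300, USL = 400, LSL = 100)
Rprof(NULL)

# Show the top entries by self time
head(summaryRprof("vectis_cap.out")$by.self, 10)
```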

Thank you,

Christopher Battles

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.