On 8/9/2011 05:24, Viechtbauer Wolfgang (STAT) wrote:
I assume you mean Cohen's kappa. This is not what the OP is asking about. The OP wants to know how to test for a difference in the proportions of 1's. Cohen's kappa will tell you what the level of agreement is between the two tests. This is something different.
Indeed it is different.
Also, the OP has now clarified that the data are paired. Therefore, prop.test() and binom.test() are not appropriate. So, to answer the OP's question: yes, mcnemar.test() is what you should be using.
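For concreteness, a minimal sketch of mcnemar.test() on invented paired results (1 = positive, 0 = negative; the data below are made up purely for illustration):

set.seed(1)
testA <- rbinom(50, 1, 0.6)   # results of the first diagnostic test
testB <- rbinom(50, 1, 0.4)   # results of the second test on the same subjects

tab <- table(testA, testB)    # 2x2 table of paired outcomes
mcnemar.test(tab)             # tests whether the two tests' proportions of 1's differ

McNemar's test uses only the discordant cells of that table, which is exactly what makes it suitable for paired proportions.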
While it seems a consensus has been reached on the understanding of the OP's wish, I am still not completely sure, as the word "accuracy" has been used as a synonym for the /proportions/ in the two tests.
Also, since the OP describes "2 different diagnostic tests", my interpretation is that the agreement between the tests should be considered (a small sketch follows).
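If agreement is indeed what is wanted, Cohen's kappa can be computed by hand from the paired 2x2 table; a small base-R sketch with the same kind of invented data (packages such as irr or psych also provide kappa functions):

set.seed(1)
testA <- rbinom(50, 1, 0.6)
testB <- rbinom(50, 1, 0.4)

tab <- table(testA, testB)
n  <- sum(tab)
po <- sum(diag(tab)) / n                      # observed agreement
pe <- sum(rowSums(tab) * colSums(tab)) / n^2  # agreement expected by chance
(po - pe) / (1 - pe)                          # Cohen's kappa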
OTOH, if the accuracy can be assessed against a gold standard, I would suggest approaching the problem with confusion matrices.
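By way of illustration, with an invented gold standard each test can be crossed against it and the usual accuracy measures read off the confusion matrix:

set.seed(1)
test <- rbinom(50, 1, 0.6)    # results of one diagnostic test (invented)
gold <- rbinom(50, 1, 0.5)    # gold-standard status for the same subjects (invented)

cm <- table(test = test, gold = gold)   # confusion matrix
cm

accuracy    <- sum(diag(cm)) / sum(cm)
sensitivity <- cm["1", "1"] / sum(cm[, "1"])  # true positives / all truly positive
specificity <- cm["0", "0"] / sum(cm[, "0"])  # true negatives / all truly negative
c(accuracy = accuracy, sensitivity = sensitivity, specificity = specificity)

Doing this for each test separately shows where they differ; IIRC, caret::confusionMatrix() reports the same quantities if that package is available.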
HTH --
Cesar Rabak