Dear Veronika,

> I'm calculating matrix correlations with permutation tests and I got this
> funny result.  All correlation coefficients are the same with mantel.test
> {ncf} and pcol {simba}, but the two functions yield dramatically different
> p-values (using the same number of permutations). Could anyone please
> enlighten me as to what is causing the difference and which result I can trust?

I haven't used either of these functions, but I looked at the code (yes,
this is open source, so you can see how things are done). It looks to me
as though package 'ncf' does the permutations in the correct way: it takes
a square matrix and permutes its rows and columns together, in the same
order (dmat2[trek, trek] is the relevant line). Package 'simba', on the
other hand, does not seem to do this correctly: it takes the lower triangle
of the (dis)similarities as one long vector and permutes it freely,
disregarding which observations (rows, columns) the values belong to.
Without knowing what kind of results you got, I'd go for the 'ncf' way of
permuting (which is also what I did in vegan).
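
If it helps, here is a minimal sketch of the two permutation schemes as I
read them; this is not the packages' actual code, and the toy data and the
names (d1, d2, trek, r.ncf, r.simba) are made up for illustration:

## A minimal sketch: d1 and d2 are two toy distance matrices on the same
## 10 observations.
set.seed(1)
d1 <- as.matrix(dist(matrix(runif(50), 10, 5)))
d2 <- as.matrix(dist(matrix(runif(50), 10, 5)))
v1 <- as.vector(as.dist(d1))          # lower triangle of d1 as a vector
r.obs <- cor(v1, as.vector(as.dist(d2)))
nperm <- 999
n <- nrow(d2)

## 'ncf'-style: permute rows and columns of d2 together, so each permuted
## distance still belongs to a pair of real observations.
r.ncf <- replicate(nperm, {
    trek <- sample(n)
    cor(v1, as.vector(as.dist(d2[trek, trek])))
})

## 'simba'-style: shuffle the lower triangle of d2 as one free vector,
## which breaks the link between distances and observations.
v2 <- as.vector(as.dist(d2))
r.simba <- replicate(nperm, cor(v1, sample(v2)))

## One-sided permutation p-values, counting the observed statistic as one
## of the permutations.
(sum(r.ncf >= r.obs) + 1) / (nperm + 1)
(sum(r.simba >= r.obs) + 1) / (nperm + 1)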

Cheers, Jari Oksanen
