On Tue, Nov 16, 2010 at 9:40 AM, Jeff Bassett <jbass...@cs.gmu.edu> wrote:
> Giovanni,
>
> Both matrices describing the points (A and B in my example) are the
> same size, so the resulting matrix will always be square. Also, the
> equation I'm using is essentially the following identity:
>
>     Var(A + B) = Var(A) + Var(B) + Cov(A, B) + Cov(B, A)
>
> All the covariance matrices that result from the Var() terms should
> be positive definite, and while it seems possible that either of the
> matrices resulting from the Cov() terms may not be, the sum of the
> two should be. Do you agree?
>
> Of course, the above identity only holds if the data is normally
> distributed.

Hi Jeff,

As far as I can see, the identity above holds irrespective of how your
data is distributed: just write out the definitions of Var and Cov and
you will see that the equation is always true.
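For instance, here is a quick numerical check in R with deliberately
non-normal data (a sketch; the particular matrices are arbitrary):

    ## var() of a matrix is the covariance matrix of its columns;
    ## cov(X, Y) is the cross-covariance matrix of the columns of X and Y.
    set.seed(1)
    A <- matrix(rexp(300), nrow = 100, ncol = 3)    # exponential data
    B <- matrix(runif(300), nrow = 100, ncol = 3)   # uniform data

    lhs <- var(A + B)
    rhs <- var(A) + var(B) + cov(A, B) + cov(B, A)
    all.equal(lhs, rhs)    # TRUE, up to floating-point error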
It is easy to come up with examples where Cov(A, B) + Cov(B, A) is not
positive definite. As an extreme case, consider a matrix A (say 10
columns, 100 rows) whose off-diagonal covariances are all zero and
whose columns are standardized, so cov(A) = diag(1, 1, 1, ...). Then
take B = -A, so cov(A, B) = cov(B, A) = diag(-1, -1, -1, ...).
Obviously, cov(A, B) + cov(B, A) is not positive definite; in fact it
is negative definite.

If the matrices A and B are completely independent, it is not very
likely that cov(A, B) + cov(B, A) will be positive definite. For
example, if A and B each had just one column, there would be a 50-50
chance that cov(A, B), a single number, is negative. With more than
one column the chance is even smaller, because all the eigenvalues of
the sum would have to be positive.

You may be able to generate matrices whose cov(A, B) is positive
definite by starting with a matrix A, generating a random matrix B0,
subtracting from the columns of B0 their projections onto the columns
of A, and then adding a small multiple of A to get B (see the sketch
at the end of this message). But this may not be what you want or
need. Alternatively (and more likely), something in your approach may
need re-thinking.

HTH,
Peter

> - Jeff
>
> On Mon, Nov 15, 2010 at 3:56 PM, Giovanni Petris <gpet...@uark.edu> wrote:
>> What made you think that a cross-covariance matrix should be
>> positive definite? It does not even need to be a square matrix, or
>> symmetric.
>>
>> Giovanni Petris
>>
>> On Mon, 2010-11-15 at 12:58 -0500, Jeff Bassett wrote:
>>> I am creating covariance matrices from sets of points, and I am
>>> having frequent problems where I create matrices that are
>>> non-positive definite. I've started using the corpcor package,
>>> which was specifically designed to address these types of problems.
>>> It has solved many of my problems, but I still have one left.
>>>
>>> One of the matrices I need to calculate is a cross-covariance
>>> matrix. In other words, I need to calculate cov(A, B), where A and
>>> B are each a matrix defining a set of points. The corpcor package
>>> does not seem to be able to perform this operation.
>>>
>>> Can anyone suggest a way to create cross-covariance matrices that
>>> are guaranteed (or at least likely) to be positive definite, either
>>> using corpcor or another package?
>>>
>>> I'm using R 2.8.1 and corpcor 1.5.2 on Mac OS X 10.5.8.
>>>
>>> - Jeff
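P.S. A rough sketch in R of the construction described above (the
dimensions, the centering of A, and the multiplier eps are arbitrary
choices for the illustration, not requirements):

    set.seed(42)
    n <- 100; p <- 5
    A  <- scale(matrix(rnorm(n * p), n, p), center = TRUE, scale = FALSE)
    B0 <- matrix(rnorm(n * p), n, p)

    ## Remove from B0 the projections of its columns onto the column
    ## space of A, then add back a small multiple of A.
    resid <- B0 - A %*% solve(crossprod(A), crossprod(A, B0))
    eps <- 0.1
    B <- resid + eps * A

    ## Because A is column-centered, cov(A, B) works out to
    ## eps * crossprod(A) / (n - 1), which is positive definite, and
    ## hence so is cov(A, B) + cov(B, A).
    eigen(cov(A, B) + cov(B, A), symmetric = TRUE)$values  # all positive

    ## And the extreme example from above, B = -A:
    eigen(cov(A, -A) + cov(-A, A), symmetric = TRUE)$values  # all negative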