Hi all,

> How are you using the values? How significant are the differences?

I am using these eigenvectors to do PCA on a set of images (of faces). I
sort the eigenvectors in descending order of their eigenvalues, and this
is multiplied with the original data of some images (viz. a matrix) to
obtain a facespace.
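For reference, the step described above can be sketched in numpy. The toy array shapes and the names (`adjfaces`, `facespace`) are illustrative assumptions, not the poster's actual code:

```python
# Sketch of the PCA step: sort eigenvectors by descending eigenvalue and
# project the image data onto them. Toy sizes: 4 images of 100 pixels each.
import numpy as np

rng = np.random.default_rng(0)
adjfaces = rng.standard_normal((4, 100))    # mean-adjusted image data (assumed shape)

covarmat = adjfaces @ adjfaces.T            # small 4x4 covariance-like matrix
evalues, evect = np.linalg.eigh(covarmat)   # eigh returns eigenvalues in ascending order

order = np.argsort(evalues)[::-1]           # indices for descending eigenvalues
evalues = evalues[order]
evect = evect[:, order]                     # eigenvectors are the columns

facespace = evect.T @ adjfaces              # project images onto the sorted eigenvectors
print(facespace.shape)
```

Note that `eigh` returns eigenvalues in ascending order, so the explicit descending sort above is needed before taking the "most significant" eigenvectors.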
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of [EMAIL PROTECTED]
Sent: Wednesday, 20 February 2008 14:45
To: numpy-discussion@scipy.org
Subject: Re: [Numpy-discussion] finding eigenvectors etc
> How are you using the values? How significant are the differences?

You shouldn't have such differences, that's strange. Are you sure you're
using the correct eigenvectors?

Matthieu
On Feb 20, 2008 1:00 AM, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
> > Different implementations follow different conventions as to which
> > is which.
>
> thank you for the replies.. the reason why i asked was that the most
> significant eigenvectors (sorted according to eigenvalues) are later
> used in calculations, and the results obtained differ in Java and Python.
The vectors that you used to build your covariance matrix all lay in, or close
to, a 3-dimensional subspace of the 4-dimensional space in which they were
represented. So one of the eigenvalues of the covariance matrix is 0, or close
to it; the matrix is singular. The condition number is the ratio of the
largest eigenvalue to the smallest, so here it is effectively infinite.
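A toy numpy sketch of this situation (synthetic data confined to a 3-D subspace of R^4; all names are illustrative, not from the original post):

```python
# Vectors lying in a 3-D subspace of 4-D space give a covariance matrix with
# one (near-)zero eigenvalue and a huge condition number.
import numpy as np

rng = np.random.default_rng(1)
basis = rng.standard_normal((3, 4))           # 3 directions spanning a 3-D subspace
data = rng.standard_normal((100, 3)) @ basis  # 100 samples, all inside that subspace

cov = data.T @ data                           # 4x4 covariance-like matrix, rank 3
evals = np.linalg.eigvalsh(cov)               # eigenvalues in ascending order

ratio = abs(evals[0]) / evals[-1]             # smallest/largest: near machine epsilon
print(ratio)
print(np.linalg.cond(cov))                    # very large condition number
```

The smallest eigenvalue returned is not a meaningful zero but rounding noise at the level of double precision, which is why different implementations report different tiny values for it.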
>
> > Your matrix is almost singular, is badly conditioned,
>
> Matthieu, can you explain that? I didn't quite get it..
> dn
>
The condition number is the ratio between the biggest eigenvalue and the
smallest one. In your case the smallest eigenvalue is about 1e-16 times the
biggest, i.e. at the precision of double-precision numbers. That means that
some of the values you compute are essentially rounding noise.
> Different implementations follow different conventions as to which
> is which.
thank you for the replies.. the reason why i asked was that the most
significant eigenvectors (sorted according to eigenvalues) are later
used in calculations, and then the results obtained differ in Java and
Python.
Yes.
Your first eigenvalue is effectively 0; the values you see are just noise.
Different implementations produce different noise.
As for the signs of the eigenvector components, which direction is + or - X
is arbitrary. Different implementations follow different conventions as to
which is which.
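A minimal sketch of one way to neutralize that sign ambiguity before comparing results across implementations (the `fix_signs` helper is hypothetical, not a numpy function):

```python
# If v is an eigenvector, so is -v, and different libraries may return either.
# A common convention: flip each eigenvector so its largest-magnitude
# component is positive.
import numpy as np

def fix_signs(vecs):
    """Flip each column so its largest-magnitude entry is positive."""
    idx = np.abs(vecs).argmax(axis=0)                  # row of biggest entry per column
    signs = np.sign(vecs[idx, range(vecs.shape[1])])   # sign of that entry
    return vecs * signs                                # broadcast flip per column

a = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, v = np.linalg.eigh(a)
same = np.allclose(fix_signs(v), fix_signs(-v))        # convention removes the ambiguity
print(same)
```

After applying the same convention to the Java and Python outputs, the significant eigenvectors should agree up to floating-point tolerance.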
Hi,
The results are OK; they are very close. Your matrix is almost singular and
badly conditioned... But the results are very close if you check them in
a relative way: 3.84433376e-03 or -6.835301757686207E-4 are the same compared
to 2.76980401e+13.
Matthieu
2008/2/20, [EMAIL PROTECTED] <[EMAIL PROTECTED]>:
hi
i was calculating eigenvalues and eigenvectors for a covariance matrix
using numpy:

from numpy import matrix
from numpy.linalg import eigh

adjfaces = matrix(adjarr)           # adjarr: the mean-adjusted image data
faces_trans = adjfaces.transpose()
covarmat = adjfaces * faces_trans   # '*' is matrix multiplication for matrix objects
evalues, evect = eigh(covarmat)     # eigh: for symmetric (Hermitian) matrices
for a sample covarmat like
[[ 1.69365981e+13 , -5.44960784e+12, -9.00346400e+12 , -2.48352625e