David Warde-Farley wrote:
> On 9-Jun-09, at 3:54 AM, David Cournapeau wrote:
>
>   
>> For example, what ML people call PCA is called Karhunen-Loève in
>> signal processing, and the concepts are quite similar.
>>     
>
>
> Yup. This seems to be a nice set of review notes:
>
>       http://www.ece.rutgers.edu/~orfanidi/ece525/svd.pdf
>   

This indeed looks like a very nice review from a signal processing
perspective. I never took the time to understand the
similarities/differences/connections between traditional SP approaches
and the machine learning approach. I wonder whether subspace methods à
la PENCIL/MUSIC and co. have a (useful) interpretation in a more ML
setting; I never really thought about it. I guess other people have :)
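
For what it's worth, the PCA/KLT correspondence itself is easy to check
numerically: the principal directions PCA finds via the SVD of the
centered data matrix are exactly the eigenvectors of the sample
covariance, i.e. the discrete KL basis. A minimal NumPy sketch (the
random toy data is only for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # correlated toy data
    Xc = X - X.mean(axis=0)                                  # center the data

    # PCA: right singular vectors of the centered data matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)

    # KLT: eigenvectors of the sample covariance matrix
    evals, evecs = np.linalg.eigh(Xc.T @ Xc / (len(Xc) - 1))
    evecs = evecs[:, ::-1]          # eigh sorts ascending; flip to descending

    # Same basis, up to the usual per-vector sign ambiguity
    print(np.allclose(np.abs(Vt), np.abs(evecs.T)))  # -> True

Subspace methods like MUSIC run on the same eigen-machinery, except
they exploit orthogonality to the trailing (noise) eigenvectors rather
than keeping the leading ones, which is probably where an ML
interpretation would start.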

> And going further than just PCA/KLT, tying it together with maximum  
> likelihood factor analysis/linear dynamical systems/hidden Markov  
> models,
>
>       http://www.cs.toronto.edu/~roweis/papers/NC110201.pdf
>   

As much as I like this paper, I always felt that you miss a lot of
insight when considering PCA only from a purely statistical POV. I
really like looking at PCA from a function approximation POV (chapter 9
of Mallat's book on wavelets is crystal clear, for example, and it
builds on all that cool functional space theory, like Besov spaces).
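
To make the approximation angle concrete: truncating the KL/PCA
expansion after k terms gives the best k-dimensional linear
approximation in the mean-square sense, and the error at each k is (up
to normalization) the tail sum of the discarded squared singular
values. A rough NumPy sketch, using random walks as a stand-in for
smooth-ish signals:

    import numpy as np

    rng = np.random.default_rng(1)
    X = np.cumsum(rng.normal(size=(200, 16)), axis=1)  # toy signals
    Xc = X - X.mean(axis=0)

    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Project onto the first k principal directions and measure the
    # mean-square error of the resulting rank-k approximation
    for k in (1, 2, 4, 8, 16):
        Xk = (Xc @ Vt[:k].T) @ Vt[:k]
        print(k, np.mean((Xc - Xk) ** 2))

Loosely speaking, the rates at which such errors decay over signal
classes are what the functional-space machinery classifies.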

cheers,

David