Newsgroups: comp.ai.neural-nets
Path: cantaloupe.srv.cs.cmu.edu!europa.chnt.gtegsc.com!news.sprintlink.net!redstone.interpath.net!sas!mozart.unx.sas.com!saswss
From: saswss@hotellng.unx.sas.com (Warren Sarle)
Subject: Re: A question about PCA
Originator: saswss@hotellng.unx.sas.com
Sender: news@unx.sas.com (Noter of Newsworthy Events)
Message-ID: <DBDtsL.M1r@unx.sas.com>
Date: Sat, 8 Jul 1995 05:26:45 GMT
X-Nntp-Posting-Host: hotellng.unx.sas.com
References: <3tep9n$5a3@bmsr14.usc.edu> <260894555wnr@chmqst.demon.co.uk>
Organization: SAS Institute Inc.
Lines: 27


In article <260894555wnr@chmqst.demon.co.uk>, David Livingstone <davel@chmqst.demon.co.uk> writes:
|> In article: <3tep9n$5a3@bmsr14.usc.edu>  saglam@bmsr14.usc.edu (Mehmet Akif Saglam)
|> writes:
|> >
|> > Does anybody know if principal component analysis can be
|> > used for supervised learning ?
|>
|> There is a technique called Partial Least Squares (PLS) which is a combination of
|> principal component analysis and principal component regression in the same step -
|> the components (latent variables) are generated so as to have maximum correlation with
|> a dependent variable,

No. While PLS does have some similarity to principal component
regression, the components in PLS (which the PLS folks insist on calling
"latent variables", quite wrongly) are not principal components, so I
hardly see how you could consider PLS a combination of two things that
both use principal components. Furthermore, the PLS components are not
generated to have maximum correlation with the dependent variable; if
they were, PLS would simply reduce to multiple regression.
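To make the distinction concrete, here is a minimal numpy sketch (the
data, sizes, and seed are all made up for illustration). The first
principal component is the direction of maximum variance in X and never
looks at y; the first PLS weight vector is proportional to X'y, i.e. it
maximizes covariance (not correlation) with y:

```python
import numpy as np

# Illustrative data (made up): x1 has large variance but is
# unrelated to y; y depends only on the low-variance x2.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(scale=5.0, size=n)
x2 = rng.normal(scale=1.0, size=n)
X = np.column_stack([x1, x2])
X -= X.mean(axis=0)
y = x2 + rng.normal(scale=0.1, size=n)
y -= y.mean()

# First principal component: top right singular vector of X,
# i.e. the direction of maximum variance. It ignores y entirely.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
pca_dir = Vt[0]

# First PLS weight vector: proportional to X'y, the direction
# of maximum covariance (not correlation) with y.
pls_dir = X.T @ y
pls_dir /= np.linalg.norm(pls_dir)

# pca_dir is dominated by the high-variance x1; pls_dir is
# dominated by x2, the variable that actually predicts y.
print(pca_dir)
print(pls_dir)
```

The two directions come out nearly orthogonal on this data, which is the
point: neither one is a rotation or rescaling of the other, and neither
is what you would get by maximizing correlation with y, which is just
the ordinary least-squares coefficient vector.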


-- 

Warren S. Sarle       SAS Institute Inc.   The opinions expressed here
saswss@unx.sas.com    SAS Campus Drive     are mine and not necessarily
(919) 677-8000        Cary, NC 27513, USA  those of SAS Institute.
