<UL>No. In discriminant analysis the groups (clusters) are determined 
beforehand and the object is to determine the linear combination of 
independent variables which best discriminates among the groups. In cluster 
analysis the groups (clusters) are not predetermined, and in fact the object 
is to determine the best way in which cases may be clustered into groups. 
</UL>
<P><A name=constant></A></P>
<LI><B>When does the discriminant function have no constant term?</B>
<UL>When the data are standardized or are deviations from the mean. </UL>
<P><A name=hov></A></P>
<LI><B>How important is it that the assumptions of homogeneity of variances 
and of multivariate normal distribution be met?</B>
<UL>Lachenbruch (1975) indicates that DA is relatively robust even when 
there are modest violations of these assumptions. Klecka (1980) points out 
that dichotomous variables, which often violate multivariate normality, are 
not likely to affect conclusions based on DA. </UL>
<P><A name=betas></A></P>
<LI><B>In DA, how can you assess the relative importance of the discriminating 
variables?</B>
<UL>The same as in regression, by comparing beta weights. If these are not 
output directly by one's statistical package (SPSS outputs them), one may obtain 
beta weights by running DA on standardized scores. That is, betas are 
standardized discriminant function coefficients. The ratio of the betas is 
the relative contribution of each variable. Note that the betas will change 
if variables are added to or deleted from the equation.
<P><I>Dummy variables</I>. As in regression, dummy variables must be 
assessed as a group, not on the basis of individual beta weights. This is 
done through <B>hierarchical discriminant analysis</B>, running the analysis 
first with, then without, the set of dummies. The difference in the squared 
canonical correlation indicates the explanatory effect of the set of dummies.
<P>Alternatively, for interval independents, one can correlate the 
discriminant function scores with the independents. The discriminating 
variables which matter the most to a particular function will be correlated 
highest with the DA scores. </P></UL>
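<P>To make the betas concrete, here is a minimal numpy sketch with hypothetical data. Fisher's two-group formula stands in for a full DA package: running it on z-scored data yields standardized discriminant function coefficients (betas) directly.</P>

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: two groups, three discriminating variables
X0 = rng.normal([0, 0, 0], 1.0, size=(50, 3))
X1 = rng.normal([2, 1, 0], 1.0, size=(50, 3))
X = np.vstack([X0, X1])

# Standardize, then compute Fisher's two-group discriminant: w = Sw^-1 (m1 - m0).
# On z-scores the resulting coefficients are the betas.
Z = (X - X.mean(0)) / X.std(0, ddof=1)
Z0, Z1 = Z[:50], Z[50:]
Sw = ((Z0 - Z0.mean(0)).T @ (Z0 - Z0.mean(0)) +
      (Z1 - Z1.mean(0)).T @ (Z1 - Z1.mean(0))) / (len(Z) - 2)
betas = np.linalg.solve(Sw, Z1.mean(0) - Z0.mean(0))   # standardized coefficients
print("betas:", betas)
print("relative contribution:", betas / np.abs(betas).max())
```

<P>As expected, the variable with the largest group separation gets the largest beta, while the pure-noise variable's beta is near zero.</P>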
<P><A name=mle></A></P>
<LI><B>What is the maximum likelihood estimation method in discriminant 
analysis (logistic discriminant function analysis)?</B>
<UL>Using MLE, a discriminant function is a function of the form T = k1X1 + 
k2X2 + ... + knXn, where X1...Xn are the differences between the two groups 
on each independent variable, k1...kn are the logit coefficients, and T 
is a function which classes the case into group 0 or group 1. If the data 
are unstandardized, there is also a constant term. The discriminant function 
arrives at coefficients which set the highest possible ratio of 
between-groups to within-groups variance (similar to the ANOVA F test, except 
that in DA the group variable is the dependent rather than the independent). 
This method, called <B>logistic discriminant function analysis</B>, is 
supported by SPSS. </UL>
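<P>A bare-bones illustration of the MLE idea, fitting the logit coefficients k1...kn (plus a constant, since the data are unstandardized) by gradient ascent on the log-likelihood. Hypothetical data, numpy only; this is a sketch of the principle, not SPSS's actual algorithm.</P>

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical two-group data on two independent variables
X = np.vstack([rng.normal(0, 1, (60, 2)), rng.normal(2, 1, (60, 2))])
y = np.repeat([0, 1], 60)

# Fit T = k0 + k1*X1 + k2*X2 by maximum likelihood (gradient ascent)
Xc = np.column_stack([np.ones(len(X)), X])   # constant term: data are unstandardized
k = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xc @ k))            # logistic probability of group 1
    k += 0.5 * Xc.T @ (y - p) / len(y)       # step along the log-likelihood gradient

pred = (1 / (1 + np.exp(-Xc @ k)) >= 0.5).astype(int)   # class into group 0 or 1
print("logit coefficients:", k)
print("classification accuracy:", (pred == y).mean())
```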
<P><A name=fisher></A></P>
<LI><B>What are Fisher's linear discriminant functions? </B>
<UL>The classical method of discriminant classification calculated one set 
of discriminant function coefficients for each dependent category, using 
these to make the classifications. SPSS still outputs these coefficients if 
you check the "Fisher's" box under the Statistics option in discriminant 
function analysis. </UL>
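<P>The one-coefficient-set-per-category idea can be sketched in numpy as follows (hypothetical two-group data, equal prior probabilities assumed): each category k gets coefficients c_k = Sw^-1 m_k with constant a_k = -0.5 m_k' Sw^-1 m_k, and a case is assigned to the category with the highest function score.</P>

```python
import numpy as np

rng = np.random.default_rng(2)
groups = [rng.normal([0, 0], 1, (40, 2)), rng.normal([3, 1], 1, (40, 2))]
y = np.repeat([0, 1], 40)
X = np.vstack(groups)

# Pooled within-groups covariance, then one coefficient set per category
Sw = sum((g - g.mean(0)).T @ (g - g.mean(0)) for g in groups) / (len(X) - 2)
coefs, consts = [], []
for g in groups:
    m = g.mean(0)
    c = np.linalg.solve(Sw, m)      # classification function coefficients
    coefs.append(c)
    consts.append(-0.5 * m @ c)     # constant term (equal priors assumed)

# Classify each case into the category with the highest function score
scores = np.column_stack([X @ c + a for c, a in zip(coefs, consts)])
pred = scores.argmax(1)
print("classification accuracy:", (pred == y).mean())
```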
<P><A name=step></A></P>
<LI><B>What is stepwise DA?</B>
<UL>Stepwise procedures select the most correlated independent first, remove 
the variance it explains in the dependent, then select the second independent 
which most correlates with the remaining variance in the dependent, and so on 
until selection of an additional independent does not increase the R-squared 
(in DA, canonical R-squared) by a significant amount (usually signif = .05). 
As in multiple regression, there are both forward (adding variables) and 
backward (removing variables) stepwise versions.
<P>In SPSS there are several available criteria for entering or removing 
variables at each step: Wilks' lambda, unexplained variance, 
Mahalanobis distance, smallest F ratio, and Rao's V. The researcher typically 
sets the critical significance level by setting the "F to remove" in most 
statistical packages.
<P>Stepwise procedures are sometimes said to eliminate the problem of 
multicollinearity, but this is misleading. The stepwise procedure uses an 
intelligent criterion to set the order of entry, but it certainly does not 
eliminate the problem of multicollinearity. To the extent that independents 
are highly intercorrelated, the standard errors of their standardized 
discriminant coefficients will be inflated and it will be difficult to assess 
the relative importance of the independent variables.
<P>The researcher should keep in mind that the stepwise method capitalizes 
on chance associations, and thus true significance levels are worse (that is, 
numerically higher) than the alpha rate reported. Thus a reported 
significance level of .05 may correspond to a true alpha rate of 
.10 or worse. </P></UL>
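<P>A forward stepwise pass using the Wilks' lambda entry criterion can be sketched in a few lines of numpy. The data are hypothetical, and a fixed drop-in-lambda threshold stands in for the F-to-enter significance test a real package would apply.</P>

```python
import numpy as np

def wilks_lambda(X, y):
    """Wilks' lambda for a set of discriminating variables: det(W) / det(T)."""
    T = (X - X.mean(0)).T @ (X - X.mean(0))                 # total SSCP
    W = sum((X[y == g] - X[y == g].mean(0)).T @ (X[y == g] - X[y == g].mean(0))
            for g in np.unique(y))                          # within-groups SSCP
    return np.linalg.det(W) / np.linalg.det(T)

rng = np.random.default_rng(3)
y = np.repeat([0, 1], 50)
X = rng.normal(0, 1, (100, 4))
X[:, 0] += 2 * y          # hypothetical: variable 0 discriminates strongly
X[:, 1] += 1 * y          # variable 1 moderately; variables 2-3 are noise

# Forward stepwise: enter the variable giving the largest drop in Wilks' lambda
selected, remaining = [], list(range(4))
while remaining:
    lams = {j: wilks_lambda(X[:, selected + [j]], y) for j in remaining}
    best = min(lams, key=lams.get)
    current = wilks_lambda(X[:, selected], y) if selected else 1.0
    if current - lams[best] < 0.01:   # crude entry criterion, not a real F test
        break
    selected.append(best)
    remaining.remove(best)
print("entry order:", selected)
```

<P>The strongly discriminating variable enters first; as the text warns, a noise variable can still enter by chance, which is exactly the capitalization-on-chance problem noted above.</P>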
<P><A name=mancova></A></P>
<LI><B>I have heard DA is related to MANCOVA. How so?</B>
<UL>Discriminant analysis can be conceptualized as the inverse of MANCOVA. 
MANCOVA can be used to see the effect on multiple dependents of a single 
categorical independent, while DA can be used to see the effect on a 
categorical dependent of multiple interval independents. The SPSS MANOVA 
procedure, which also covers MANCOVA, can be used to generate discriminant 
functions as well, though in practical terms this is not the easiest route 
for the researcher interested in DA. </UL>
<P></P></LI></UL>
<P><BR>
<H2>Bibliography</H2>
<UL>
<LI>Dunteman, George H. (1984). <I>Introduction to Multivariate Analysis</I>. 
Thousand Oaks, CA: Sage Publications. Chapter 5 covers classification 
procedures and discriminant analysis.
<P></P>
<LI>Klecka, William R. (1980). <I>Discriminant Analysis</I>. Quantitative 
Applications in the Social Sciences Series, No. 19. Thousand Oaks, CA: Sage 
Publications.
<P></P>
<LI>Lachenbruch, P. A. (1975). <I>Discriminant Analysis</I>. NY: Hafner.
<P></P>
<LI>Press, S. J. and S. Wilson (1978). Choosing between logistic regression 
and discriminant analysis. <I>Journal of the American Statistical 
Association</I>, Vol. 73: 699-705. The authors make the case for the 
superiority of logistic regression for situations where the assumptions of 
multivariate normality are not met (ex., when dummy variables are used), 
though discriminant analysis is held to be better when assumptions are met. 
They conclude that logistic and discriminant analyses will usually yield the 
same conclusions, except in the case when there are independents which result 
in predictions very close to 0 and 1 in logistic analysis. </LI></UL>
<P>
<P>
<HR>
</BODY></HTML>