University of Exeter
DEPARTMENT OF PSYCHOLOGY

PSY6003 Advanced statistics: Multivariate analysis II: Manifest variables analyses

Topic 4: Logistic regression and discriminant analysis
Contents of this handout: The problem of dichotomous dependent variables; Discriminant analysis; Logistic regression - theory; Logistic regression (and discriminant analysis) in practice; Interpreting and reporting logistic regression results; References and further reading; Examples.
The Problem: Categorical dependent variables

A limitation of ordinary linear models is the requirement that the dependent variable be numerical rather than categorical. But many interesting variables are categorical - patients may live or die, people may pass or fail MScs, and so on. A range of techniques has been developed for analysing data with categorical dependent variables, including discriminant analysis, probit analysis, log-linear regression and logistic regression. To contrast it with these, the kind of regression we have used so far is usually referred to as linear regression.
The various techniques listed above are applicable in different situations: for example, log-linear regression requires all regressors to be categorical, whilst discriminant analysis strictly requires them all to be continuous (though dummy variables can be used, as in multiple regression). In SPSS at least, logistic regression is easier to use than discriminant analysis when we have a mixture of numerical and categorical regressors, because it includes procedures for generating the necessary dummy variables automatically.
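To make the dummy-variable step concrete, here is a minimal sketch in Python (not part of the original handout, which works in SPSS); it assumes pandas is available, and the social-class variable and its categories are invented for illustration:

```python
# Dummy coding a categorical regressor by hand, assuming pandas.
import pandas as pd

social_class = pd.Series(["working", "middle", "upper", "middle"], name="class")

# k categories become k-1 dummy (0/1) columns; one category is the
# reference level, exactly as in multiple regression.
dummies = pd.get_dummies(social_class, prefix="class", drop_first=True)
print(dummies)
```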
Discriminant analysis

The major purpose of discriminant analysis is to predict membership of two or more mutually exclusive groups from a set of predictors, when there is no natural ordering on the groups. So we may ask whether we can predict whether people vote Labour or Conservative from a knowledge of their age, their class, their attitudes, their values, and so on.
Discriminant analysis is just the inverse of a one-way MANOVA, the multivariate analysis of variance. The levels of the independent variable (or factor) for MANOVA become the categories of the dependent variable for discriminant analysis, and the dependent variables of the MANOVA become the predictors for discriminant analysis. In MANOVA we ask whether group membership produces reliable differences on a combination of dependent variables. If the answer to that question is 'yes', then clearly that combination of variables can be used to predict group membership. Mathematically, MANOVA and discriminant analysis are the same; indeed, the SPSS MANOVA command can be used to print out the discriminant functions that are at the heart of discriminant analysis, though this is not usually the easiest way of obtaining them. These discriminant functions are the linear combinations of the standardised independent variables which yield the biggest mean differences between the groups. If the dependent variable is a dichotomy, there is one discriminant function; if there are k levels of the dependent variable, up to k-1 discriminant functions can be extracted, and we can test how many it is worth extracting. Successive discriminant functions are orthogonal to one another, like principal components, but they are not the same as the principal components you would obtain if you just did a principal components analysis on the independent variables, because they are constructed to maximise the differences between the values of the dependent variable.
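As an illustration of these ideas outside SPSS, here is a minimal sketch in Python, assuming scikit-learn is installed; the voting data are fabricated and the two predictors (age and an attitude score) are invented for the example. With two groups there is k-1 = 1 discriminant function:

```python
# A sketch of two-group discriminant analysis, assuming scikit-learn.
# Fabricated data: age and an attitude score for 50 voters per party.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
labour = rng.normal(loc=[40, 0.3], scale=[8, 0.1], size=(50, 2))
conservative = rng.normal(loc=[55, 0.6], scale=[8, 0.1], size=(50, 2))
X = np.vstack([labour, conservative])            # predictors
y = np.repeat(["Labour", "Conservative"], 50)    # group membership

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.scalings_)    # weights of the single (k-1 = 1) discriminant function
print(lda.score(X, y))  # proportion of cases classified into the right group
```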
The commonest use of discriminant analysis is where there are just two categories in the dependent variable; but as we have seen, it can be used for multi-way categories (just as MANOVA can be used to test the significance of differences between several groups, not just two). This is an advantage over logistic regression, which is always described for the problem of a dichotomous dependent variable.

You will encounter discriminant analysis fairly often in journals. But it is now being replaced by logistic regression, as this approach requires fewer assumptions in theory, is more statistically robust in practice, and is easier to use and understand than discriminant analysis. So we will concentrate on logistic regression.
Logistic regression: theory

Just like linear regression, logistic regression gives each regressor a coefficient b which measures the regressor's independent contribution to variations in the dependent variable. But there are technical problems with dependent variables that can only take the values 0 and 1. What we want to predict from a knowledge of relevant independent variables is not a precise numerical value of the dependent variable, but rather the probability (p) that it is 1 rather than 0. We might think that we could use this probability as the dependent variable in an ordinary regression, i.e. as a simple linear function of the regressors, but we cannot, for two reasons. First, numerical regressors may be unlimited in range. If we expressed p as a linear function of income, we might then find ourselves predicting that p is greater than 1 (which cannot be true, as probabilities can only take values between 0 and 1). Second, there is a problem of additivity. Imagine that we are trying to predict success at a task from two dichotomous variables, training and gender. Among untrained individuals, 50% of men succeed and 70% of women. Among trained men, 90% succeed. If we thought of p as a linear function of gender and training, we would have to estimate the proportion of trained women who succeed as 70% plus 40% = 110% (which again cannot be true).
We get over this problem by making a logistic transformation of p, also called taking the logit of p. Logit(p) is the log (to base e) of the odds, or likelihood ratio, that the dependent variable is 1. In symbols it is defined as:

    logit(p) = log(p / (1 - p))
Whereas p can only range from 0 to 1, logit(p) ranges from negative infinity to positive infinity. The logit scale is symmetrical around the logit of 0.5 (which is zero), so the table below only includes a couple of negative values.

Table 1. The relationship between probability of success (p) and logit(p)

    p          .3     .4     .5    .6    .7    .8     .9     .95    .99
    logit(p)  -.847  -.405  0.0   .405  .847  1.386  2.197  2.944  4.595
This table makes it clear that differences between extreme probabilities are spread out: the difference of logits between success rates of .95 and .99 is much bigger than that between .5 and .7. In fact the logit scale is approximately linear in the middle range and logarithmic at extreme values.
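The entries in Table 1 are easy to reproduce. A short check in Python (a sketch, assuming only NumPy):

```python
# Reproduce Table 1: logit(p) = log(p / (1 - p)), natural log.
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

p = np.array([.3, .4, .5, .6, .7, .8, .9, .95, .99])
print(np.round(logit(p), 3))
# -> [-0.847 -0.405  0.     0.405  0.847  1.386  2.197  2.944  4.595]
```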
We do not know that the logit scale is the best possible scale, but it does seem intuitively reasonable. If we consider the example of training and gender used above, we can see how it works. On the logit scale, for untrained individuals, the difference of logits between men (success rate .50, logit 0.0) and women (success rate .70, logit 0.847) is 0.847. The success rate for trained men is .9 (logit 2.197), so we conclude that training makes a difference of logits of 2.197. We therefore predict for trained women a logit of 2.197 + 0.847 = 3.044, which corresponds to a success probability of .955.
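The arithmetic of that worked example can be checked directly: effects add on the logit scale, and the inverse transformation maps the sum back to a probability. A self-contained sketch in Python:

```python
# Verify the trained-women prediction: add effects on the logit scale,
# then invert the transformation to recover a probability.
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(z):
    return 1 / (1 + math.exp(-z))

untrained_men = logit(0.50)                    # 0.0, the baseline
gender_effect = logit(0.70) - untrained_men    # +0.847 for women
training_effect = logit(0.90) - untrained_men  # +2.197 for training
prediction = inv_logit(untrained_men + gender_effect + training_effect)
print(round(prediction, 3))                    # 0.955, safely below 1
```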
It follows that logistic regression involves fitting to the data an equation of the form:

    logit(p) = a + b1*x1 + b2*x2 + b3*x3 + ...
The meaning of the coefficients b1, b2, etc. is discussed below.
Although logistic regression finds a "best fitting" equation just as linear regression does, the principles on which it does so are rather different. Instead of using a least-squared deviations criterion for the best fit, it uses a maximum likelihood method, which maximises the probability of getting the observed results given the fitted regression coefficients. A consequence of this is that the goodness of fit and overall significance statistics used in logistic regression are different from those used in linear regression.
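As a concrete illustration of fitting such an equation by maximum likelihood, here is a sketch in Python assuming the statsmodels package; the handout itself uses SPSS, and the data below are simulated from the training-and-gender example rather than taken from any real study:

```python
# Maximum-likelihood logistic regression on simulated data, assuming
# statsmodels. True coefficients follow the training/gender example above.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
trained = rng.integers(0, 2, n)
woman = rng.integers(0, 2, n)
true_logit = 0.0 + 2.197 * trained + 0.847 * woman
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

X = sm.add_constant(np.column_stack([trained, woman]).astype(float))
fit = sm.Logit(y, X).fit(disp=0)   # maximum likelihood, not least squares
print(fit.params)                  # estimates of a, b1 (training), b2 (gender)
```

With a large enough sample the fitted coefficients approach the true values of a = 0.0, b1 = 2.197 and b2 = 0.847 used to generate the data.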
Logistic regression (and discriminant analysis) in practice

Logistic regression is not available in Minitab but is one of the features relatively recently added to SPSS. The advanced statistics manuals for SPSS versions 4 onwards describe it well. If you are already familiar with the REGRESSION command, LOGISTIC REGRESSION is fairly straightforward to use and we