University of Exeter
DEPARTMENT OF PSYCHOLOGY

PSY6003 Advanced statistics: Multivariate analysis II: Manifest variables analyses (http://info.ex.ac.uk/~SEGLea/multvar2)

Topic 4: Logistic regression and discriminant analysis
Contents of this handout: The problem of dichotomous dependent variables; Discriminant analysis; Logistic regression - theory; Logistic regression (and discriminant analysis) in practice; Interpreting and reporting logistic regression results; References and further reading; Examples.
The Problem: Categorical dependent variables

A limitation of ordinary linear models is the requirement that the dependent variable be numerical rather than categorical. But many interesting variables are categorical: patients may live or die, people may pass or fail MScs, and so on. A range of techniques has been developed for analysing data with categorical dependent variables, including discriminant analysis, probit analysis, log-linear regression and logistic regression. To contrast it with these, the kind of regression we have used so far is usually referred to as linear regression.
The various techniques listed above are applicable in different situations: for example, log-linear regression requires all regressors to be categorical, whilst discriminant analysis strictly requires them all to be continuous (though dummy variables can be used, as for multiple regression). In SPSS at least, logistic regression is easier to use than discriminant analysis when we have a mixture of numerical and categorical regressors, because it includes procedures for generating the necessary dummy variables automatically.
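SPSS does this dummy coding internally; as a rough illustration of what that coding amounts to, here is a minimal Python sketch (pandas stands in for SPSS here, and the data frame and column names are invented for the example):

    import pandas as pd

    # Invented data: one numerical and one 3-level categorical regressor.
    df = pd.DataFrame({
        "age": [34, 51, 42, 28, 60],
        "social_class": ["working", "middle", "upper", "middle", "working"],
    })

    # drop_first=True yields k - 1 = 2 dummy columns, the coding a
    # regression-type model needs for a 3-level category.
    dummies = pd.get_dummies(df["social_class"], prefix="class", drop_first=True)
    X = pd.concat([df[["age"]], dummies], axis=1)
    print(X)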
Discriminant analysis

The major purpose of discriminant analysis is to predict membership in two or more mutually exclusive groups from a set of predictors, when there is no natural ordering on the groups. So we may ask whether we can predict whether people vote Labour or Conservative from a knowledge of their age, their class, their attitudes, their values, and so on.
Discriminant analysis is just the inverse of a one-way MANOVA, the multivariate analysis of variance. The levels of the independent variable (or factor) for MANOVA become the categories of the dependent variable for discriminant analysis, and the dependent variables of the MANOVA become the predictors for discriminant analysis. In MANOVA we ask whether group membership produces reliable differences on a combination of dependent variables. If the answer to that question is 'yes', then clearly that combination of variables can be used to predict group membership. Mathematically, MANOVA and discriminant analysis are the same; indeed, the SPSS MANOVA command can be used to print out the discriminant functions that are at the heart of discriminant analysis, though this is not usually the easiest way of obtaining them. These discriminant functions are the linear combinations of the standardised independent variables which yield the biggest mean differences between the groups. If the dependent variable is a dichotomy, there is one discriminant function; if there are k levels of the dependent variable, up to k-1 discriminant functions can be extracted, and we can test how many it is worth extracting. Successive discriminant functions are orthogonal to one another, like principal components, but they are not the same as the principal components you would obtain if you just did a principal components analysis on the independent variables, because they are constructed to maximise the differences between the values of the dependent variable.
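To make the discriminant functions concrete, here is a minimal sketch using scikit-learn's LinearDiscriminantAnalysis on invented synthetic data (scikit-learn is a stand-in here; the handout itself works in SPSS). With three groups, at most k - 1 = 2 discriminant functions can be extracted:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Invented data: three groups (k = 3), four continuous predictors,
    # with group means shifted so the groups differ.
    X = np.vstack([rng.normal(loc=m, size=(50, 4)) for m in (0.0, 1.0, 2.0)])
    y = np.repeat([0, 1, 2], 50)

    # With k = 3 groups, at most k - 1 = 2 discriminant functions exist.
    lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

    # scalings_ holds the weights of the discriminant functions;
    # transform() projects each case onto them.
    print(lda.scalings_)           # weights of the discriminant functions
    print(lda.transform(X).shape)  # (150, 2): each case scored on both functions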
The commonest use of discriminant analysis is where there are just two categories in the dependent variable; but as we have seen, it can be used for multi-way categories (just as MANOVA can be used to test the significance of differences between several groups, not just two). This is an advantage over logistic regression, which is always described for the problem of a dichotomous dependent variable.
You will encounter discriminant analysis fairly often in journals. But it is now being replaced with logistic regression, as this approach requires fewer assumptions in theory, is more statistically robust in practice, and is easier to use and understand than discriminant analysis. So we will concentrate on logistic regression.
Logistic regression: theory

Just like linear regression, logistic regression gives each regressor a coefficient b which measures the regressor's independent contribution to variations in the dependent variable. But there are technical problems with dependent variables that can only take values of 0 and 1. What we want to predict from a knowledge of relevant independent variables is not a precise numerical value of the dependent variable, but rather the probability (p) that it is 1 rather than 0. We might think that we could use this probability as the dependent variable in an ordinary regression, i.e. as a simple linear function of regressors, but we cannot, for two reasons. First, numerical regressors may be unlimited in range. If we expressed p as a linear function of income, we might then find ourselves predicting that p is greater than 1 (which cannot be true, as probabilities can only take values between 0 and 1). Second, there is a problem of additivity. Imagine that we are trying to predict success at a task from two dichotomous variables, training and gender. Among untrained individuals, 50% of men succeed and 70% of women. Among trained men, 90% succeed. If we thought of p as a linear function of gender and training, we would have to estimate the proportion of trained women as 70% plus 40% = 110% (which again cannot be true).
We get over this problem by making a logistic transformation of p, also called taking the logit of p. Logit(p) is the log (to base e) of the odds or likelihood ratio that the dependent variable is 1. In symbols it is defined as:

    logit(p) = log(p / (1 - p))
Whereas p can only range from 0 to 1, logit(p) ranges from negative infinity to positive infinity. The logit scale is symmetrical around the logit of 0.5 (which is zero), so the table below only includes a couple of negative values.

Table 1. The relationship between probability of success (p) and logit(p)

    p          .3     .4     .5     .6     .7     .8     .9     .95    .99
    logit(p)  -.847  -.405   0.0    .405   .847  1.386  2.197  2.944  4.595
This table makes it clear that the differences between extreme probabilities are spread out: the difference of logits between success rates of .95 and .99 is much bigger than that between .5 and .7. In fact the logit scale is approximately linear in the middle range and logarithmic at extreme values.
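The figures in Table 1 are easy to reproduce; a minimal numpy sketch:

    import numpy as np

    p = np.array([0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.95, 0.99])
    logit = np.log(p / (1 - p))   # logit(p) = log(p / (1 - p)), natural log

    for pi, li in zip(p, logit):
        print(f"p = {pi:4.2f}   logit(p) = {li:6.3f}")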
We do not know that the logit scale is the best possible scale, but it does seem intuitively reasonable. If we consider the example of training and gender used above, we can see how it works. On the logit scale, for untrained individuals, the difference of logits between men (success rate 0.50, logit 0.0) and women (success rate 0.70, logit 0.847) is 0.847. The success rate for trained men is .9 (logit 2.197), so we conclude that training makes a difference of logits of 2.197. We therefore predict for trained women a logit of 2.197 + 0.847 = 3.044, which corresponds to a success probability of .955.
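That final step inverts the logit: since logit(p) = log(p/(1-p)), we have p = e^logit / (1 + e^logit). A minimal sketch checking the prediction for trained women:

    import math

    def inv_logit(x):
        # Inverse of logit(p) = log(p / (1 - p)).
        return math.exp(x) / (1 + math.exp(x))

    # Training effect (2.197) plus gender effect (0.847) on the logit scale.
    print(round(inv_logit(2.197 + 0.847), 3))   # 0.955, as in the text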
It follows that logistic regression involves fitting to the data an equation of the form:

    logit(p) = a + b1*x1 + b2*x2 + b3*x3 + ...

The meaning of the coefficients b1, b2, etc. is discussed below.
Although logistic regression finds a "best fitting" equation just as linear regression does, the principles on which it does so are rather different. Instead of using a least-squared deviations criterion for the best fit, it uses a maximum likelihood method, which maximises the probability of getting the observed results given the fitted regression coefficients. A consequence of this is that the goodness-of-fit and overall significance statistics used in logistic regression are different from those used in linear regression.
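As an illustration of maximum likelihood fitting, here is a minimal sketch using statsmodels on invented synthetic data (the package choice and all names are illustrative; the handout itself uses SPSS). Note that the fit is summarised by a log-likelihood rather than by least-squares statistics:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Synthetic data generated from a known model: logit(p) = -1 + 2*x.
    x = rng.normal(size=200)
    p = 1 / (1 + np.exp(-(-1 + 2 * x)))
    y = rng.binomial(1, p)

    # Fit a + b*x by maximum likelihood; the estimates should be close
    # to the true a = -1, b = 2.
    result = sm.Logit(y, sm.add_constant(x)).fit(disp=False)
    print(result.params)  # fitted a and b
    print(result.llf)     # maximised log-likelihood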
Logistic regression (and discriminant analysis) in practice

Logistic regression is not available in Minitab but is one of the features relatively recently added to SPSS. The advanced statistics manuals for SPSS versions 4 onwards describe it well. If you are already familiar with the REGRESSION command, LOGISTIC REGRESSION is fairly straightforward to use.
