function prms = hdda_find_seuil(Xl,varargin)
% High Dimensionality Discriminant Analysis - This function allows to learn
% the dimensionality parameter from the learning dataset.
%
% Usage: s = hdda_find_seuil(X,'model','AiBiQiDi','n_it',10);
%
% Author: C. Bouveyron <charles.bouveyron@inrialpes.fr> - 2004-2006
%
% Reference: C. Bouveyron, S. Girard and C. Schmid, "High Dimensional Discriminant Analysis",
% Communications in Statistics, Theory and Methods, in press, 2007.

% Global parameters
model    = 'AiBiQiDi';
display  = 0;
[N,p]    = size(Xl.data);
n_it     = 25;                                  % number of resampling iterations
n        = floor(9 * size(Xl.data,1) / 10);     % size of each training subsample
seuils   = [1e-3,5e-3,1e-2:1e-2:19e-2,2e-1:5e-2:4e-1]; % candidate thresholds
dim_max  = p-1;
common_d = 0;

% Parameters management
varrem = {};
for i=1:2:length(varargin)
    if ~ischar(varargin{i}) | ~exist(varargin{i},'var')
        varrem = varargin(i:end);
        break % stop at the first unrecognized option (eval on it would fail)
    end
    eval([varargin{i} '= varargin{i+1};']);
end
if ~isempty(strmatch(model,strvcat('AijBiQiD','AijBQiD','AiBiQiD','ABiQiD','AiBQiD','ABQiD','AjBQD','ABQD'),'exact'))
    common_d = 1;
    %fprintf('--> the dimensions will be common between classes!\n')
end

% Finding the optimal threshold ("seuil")
fprintf('--> Learning: please wait ');
if ~common_d
    for j=1:length(seuils)
        fprintf('.');
        s = seuils(j);
        for i=1:n_it
            [Xtrn.data,Xtrn.cls,Xtst.data,Xtst.cls] = halfsampling(Xl.data,Xl.cls,n);
            prms = hdda_learn(Xtrn,'model',model,'seuil',s,'display',0);
            res  = hdda_classif(prms,Xtst.data);
            taux(j,i) = sum(res == Xtst.cls) / size(Xtst.cls,1);
        end
    end
    fprintf('\n');

    % Display results
    tx = mean(taux');
    st = std(taux');
    [val,ind] = max(tx);
    s_opt = seuils(ind);
    fprintf('--> Optimal threshold: %g\n',s_opt);

    % Draw results
    if display
        figure, plot(seuils,tx,'*'), hold on,
        plot(seuils(ind),tx(ind),'ro'),
        axis([0 seuils(end) min(tx)-(1-max(tx)) 1])
        for i=1:length(seuils)
            plot([seuils(i),seuils(i)],[tx(i)-st(i)/2,tx(i)+st(i)/2],':+')
        end
    end

    % Learn the classifier with the optimal threshold
    prms = hdda_learn(Xl,'model',model,'seuil',s_opt,'display',0);
else
    % Finding the optimal dimension
    dims = [1:dim_max];
    for d=1:dim_max
        fprintf('.');
        for i=1:n_it
            [Xtrn.data,Xtrn.cls,Xtst.data,Xtst.cls] = halfsampling(Xl.data,Xl.cls,n);
            prms = hdda_learn(Xtrn,'model',model,'dim',d,'display',0);
            res  = hdda_classif(prms,Xtst.data);
            taux(d,i) = sum(res == Xtst.cls) / size(Xtst.cls,1);
        end
    end
    fprintf('\n');

    % Display results
    tx = mean(taux');
    st = std(taux');
    [val,ind] = max(tx);
    d_opt = dims(ind);
    fprintf('--> Optimal dimension: %g\n',d_opt);

    % Draw results
    if display
        figure, plot(tx,'-*'), hold on,
        plot(dims(ind),tx(ind),'ro'),
        axis([0 dim_max+1 min(tx)-(1-max(tx)) 1])
        for i=1:dim_max
            plot([i,i],[tx(i)-st(i)/2,tx(i)+st(i)/2],':+')
        end
    end

    % Learn the classifier with the optimal dimension
    prms = hdda_learn(Xl,'model',model,'dim',d_opt,'display',0); % was "dim_opt", an undefined variable
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [L,cls_l,T,cls_t] = halfsampling(X,cls,n)
% [L,cls_l,T,cls_t] = halfsampling(X,cls,n)
% Randomly split (X,cls) into a training set L of n samples and a test set T
% holding the remaining samples.
[N,p] = size(X);
ind = randperm(N);
L = X(ind(1:n),:);     cls_l = cls(ind(1:n));
T = X(ind(n+1:end),:); cls_t = cls(ind(n+1:end));
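
% Usage sketch (kept as comments so the file stays a valid function file).
% The data below are hypothetical; hdda_learn and hdda_classif must be on
% the MATLAB path. Xl.data holds one observation per row, Xl.cls the class
% labels as a column vector:
%
%   Xl.data = randn(100,20);                      % 100 samples, 20 features
%   Xl.cls  = [ones(50,1); 2*ones(50,1)];         % two classes
%   prms = hdda_find_seuil(Xl,'model','AiBiQiDi','n_it',10);
%
% With a free-dimension model (e.g. 'AiBiQiDi') the threshold "seuil" is
% selected; with a common-dimension model (e.g. 'AijBiQiD') the intrinsic
% dimension itself is selected by the same resampling procedure.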