% marq.m  --  from a collection of MATLAB neural-network programs
function [W1,W2,PI_vector,iteration,lambda]=marq(NetDef,W1,W2,PHI,Y,trparms)
%  MARQ
%  ----
%              Train a two layer neural network with the Levenberg-Marquardt
%              method.
%
%          If desired, regularization by weight decay can be applied.
%          Pruned (i.e., not fully connected) networks can also be
%          trained.
%
%          Given a set of corresponding input-output pairs and an initial
%          network,
%          [W1,W2,PI_vector,iteration,lambda]=marq(NetDef,W1,W2,PHI,Y,trparms)
%          trains the network with the Levenberg-Marquardt method.
% 
%          The activation functions can be either linear or tanh. The
%          network architecture is defined by the matrix 'NetDef' which
%          has two rows. The first row specifies the hidden layer and the
%          second row specifies the output layer.
% 
%          E.g.:    NetDef = ['LHHHH';
%                             'LL---']
%
%          (L = Linear, H = tanh)
%
%          A weight is pruned by setting it to zero.
%
%          The Marquardt method is described in:
%          K. Madsen: 'Optimering' (Haefte 38), IMM, DTU, 1991
%  
%          Notice that the bias is included as the last column in the weight
%          matrices.
%
% 
%  INPUT:
%  NetDef: Network definition 
%  W1    : Input-to-hidden layer weights. The matrix dimension is
%          dim(W1) = [(# of hidden units) * (inputs + 1)] (the 1 is due to the bias)
%  W2    : Hidden-to-output layer weights.
%          dim(W2) = [(outputs)  *  (# of hidden units + 1)]
%  PHI   : Input vector. dim(PHI) = [(inputs)  *  (# of data)]
%  Y     : Output data. dim(Y) = [(outputs)  * (# of data)]
%  trparms : Vector containing parameters associated with the training
%            trparms = [max_iter stop_crit lambda D]
%             max_iter  : max # of iterations.
%             stop_crit : Stop training if criterion is below this value
%             lambda    : Initial Levenberg-Marquardt parameter
%             D         : Row vector containing the weight decay parameters.
%                            If D has one element, a common (scalar) weight
%                            decay is used for all weights. If D has two
%                            elements, the first is used as weight decay for
%                            the hidden-to-output layer and the second for the
%                            input-to-hidden layer weights. For individual
%                            weight decays, D must contain as many elements as
%                            there are weights in the network.
%
%           Default values are (obtained if left out): trparms = [500 0 1 0] 
% 
% 
%  OUTPUT:
%  W1, W2   : Weight matrices after training
%  PI_vector: Vector containing the criterion (PI) evaluated at each iteration
%  iteration: # of iterations
%  lambda   : The final value of lambda. Relevant only if retraining is desired
% 
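%  EXAMPLE (hypothetical data, for illustration only; assumes the toolbox's
%  pmntanh function, called below, is on the path):
%    PHI = randn(2,100);                  % 2 inputs, 100 samples
%    Y   = sin(PHI(1,:)) + 0.5*PHI(2,:);  % 1 output
%    NetDef = ['HHHHH';'L----'];          % 5 tanh hidden units, 1 linear output
%    W1  = 0.5*randn(5,3);                % 5 x (2 inputs + 1 bias)
%    W2  = 0.5*randn(1,6);                % 1 x (5 hidden units + 1 bias)
%    [W1,W2,PI_vector] = marq(NetDef,W1,W2,PHI,Y,[200 1e-4 1 1e-4]);
% 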
%  Programmed by : Magnus Norgaard, IAU/IMM
%  LastEditDate  : July 16, 1996


%----------------------------------------------------------------------------------
%--------------             NETWORK INITIALIZATIONS                   -------------
%----------------------------------------------------------------------------------
[outputs,N] = size(Y);                  % # of outputs and # of data
[hidden,inputs] = size(W1);             % # of hidden units 
inputs=inputs-1;                        % # of inputs
L_hidden = find(NetDef(1,:)=='L')';     % Location of linear hidden neurons
H_hidden = find(NetDef(1,:)=='H')';     % Location of tanh hidden neurons
L_output = find(NetDef(2,:)=='L')';     % Location of linear output neurons
H_output = find(NetDef(2,:)=='H')';     % Location of tanh output neurons
y1       = [zeros(hidden,N);ones(1,N)]; % Hidden layer outputs
y2       = zeros(outputs,N);            % Network output
index = outputs*(hidden+1) + 1 + [0:hidden-1]*(inputs+1); % Start of each hidden unit's weights in theta
index2 = (0:N-1)*outputs;               % Column offset in PSI for each data point
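% For example, with outputs=1, hidden=2 and inputs=3 (a hypothetical small
% net), index = 4 + [0 4] = [4 8]: theta packs W2 row by row followed by W1
% row by row, so each hidden unit's input weights start at those positions.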
iteration = 1;                          % Counter variable
dw       = 1;                           % Flag telling that the weights are new
PHI      = [PHI;ones(1,N)];             % Augment PHI with a row containing ones
parameters1= hidden*(inputs+1);         % # of input-to-hidden weights
parameters2= outputs*(hidden+1);        % # of hidden-to-output weights
parameters = parameters1 + parameters2; % Total # of weights
PSI      = zeros(parameters,outputs*N); % Deriv. of each output w.r.t. each weight
ones_h   = ones(hidden+1,1);            % A vector of ones
ones_i   = ones(inputs+1,1);            % Another vector of ones
                                        % Parameter vector containing all weights
theta = [reshape(W2',parameters2,1) ; reshape(W1',parameters1,1)];
theta_index = find(theta);              % Indices of weights ~= 0
theta_red = theta(theta_index);         % Reduced parameter vector
reduced  = length(theta_index);         % The # of parameters in theta_red
index3   = 1:(reduced+1):(reduced^2);   % Diagonal indices of a reduced x reduced matrix
if ~exist('trparms','var')              % Default training parameters
  max_iter  = 500;
  stop_crit = 0;
  lambda    = 1;
  D         = 0;
else                                    % User specified values
  max_iter  = trparms(1);
  stop_crit = trparms(2);
  lambda    = trparms(3);
  if length(trparms)==4,                % Scalar weight decay parameter
    D = trparms(4)*ones(reduced,1);     % Same decay for every nonzero weight
  elseif length(trparms)==5,            % Two weight decay parameters
    D = trparms([4*ones(1,parameters2) 5*ones(1,parameters1)])';
    D = D(theta_index);
  elseif length(trparms)>5,             % Individual weight decay
    D = trparms(4:length(trparms))';
  end
end
PI_vector = zeros(max_iter,1);          % Stores the criterion (PI) at each iteration


%----------------------------------------------------------------------------------
%--------------                   TRAIN NETWORK                       -------------
%----------------------------------------------------------------------------------
clc;
c=fix(clock);
fprintf('Network training started at %2i.%2i.%2i\n\n',c(4),c(5),c(6));


% >>>>>>>>>>>>>>>>>>>>>  COMPUTE NETWORK OUTPUT  y2(theta)   <<<<<<<<<<<<<<<<<<<<<<
h1 = W1*PHI;  
y1(H_hidden,:) = pmntanh(h1(H_hidden,:));
y1(L_hidden,:) = h1(L_hidden,:);    

h2 = W2*y1;
y2(H_output,:) = pmntanh(h2(H_output,:));
y2(L_output,:) = h2(L_output,:);

E        = Y - y2;                      % Training error
E_vector = E(:);                        % Reshape E into a long vector
SSE      = E_vector'*E_vector;          % Sum of squared errors (SSE)
PI       = (SSE+theta_red'*(D.*theta_red))/(2*N); % Performance index

while iteration<=max_iter    
if dw==1,
% >>>>>>>>>>>>>>>>>>>>>>>>>>>   COMPUTE THE PSI MATRIX   <<<<<<<<<<<<<<<<<<<<<<<<<<
% (The derivative of each network output (y2) with respect to each weight)

    % ==========   Elements corresponding to the linear output units   ============
    for i = L_output'
      index1 = (i-1) * (hidden + 1) + 1;

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      PSI(index1:index1+hidden,index2+i) = y1;
      % ---------------------------------------------------------------------
 
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,index2+i) = W2(i,j)*PHI;
      end

      for j = H_hidden',
        tmp = W2(i,j)*(1-y1(j,:).*y1(j,:)); 
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:).*PHI;
      end 
      % ---------------------------------------------------------------------    
    end

    % ============  Elements corresponding to the tanh output units   =============
    for i = H_output',
      index1 = (i-1) * (hidden + 1) + 1;

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      tmp = 1 - y2(i,:).*y2(i,:);
      PSI(index1:index1+hidden,index2+i) = y1.*tmp(ones_h,:);
      % ---------------------------------------------------------------------
     
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        tmp = W2(i,j)*(1-y2(i,:).*y2(i,:));
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:).*PHI;
      end
      
      for j = H_hidden',
        tmp  = W2(i,j)*(1-y1(j,:).*y1(j,:));
        tmp2 = (1-y2(i,:).*y2(i,:));
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:)...
                                                  .*tmp2(ones_i,:).*PHI;
      end
      % ---------------------------------------------------------------------
    end
    PSI_red = PSI(theta_index,:);
    
    % -- Gradient --
    G = PSI_red*E_vector-D.*theta_red;
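    % G equals minus one half the gradient of SSE + theta_red'*(D.*theta_red):
    % d(SSE)/dtheta = -2*PSI*E and d(theta'*(D.*theta))/dtheta = 2*D.*theta.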

    % -- Mean-square-error part of the Hessian (Gauss-Newton approximation) --
    R = PSI_red*PSI_red';
    
    dw = 0;
  end
  
   
% >>>>>>>>>>>>>>>>>>>>>>>>>>>        COMPUTE h_k        <<<<<<<<<<<<<<<<<<<<<<<<<<<
  % -- Hessian  --
  H = R;
  H(index3) = H(index3)'+lambda+D;                  % Add lambda and weight decay to the diagonal

  % -- Search direction --
  h = H\G;                                          % Solve for search direction
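  % In matrix terms this solves the Levenberg-Marquardt normal equations
  %    (R + lambda*I + diag(D)) * h = G,
  % a Gauss-Newton step that is damped toward steepest descent as lambda grows.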

  % -- Compute the 'a priori' iterate --
  theta_red_new = theta_red + h;                    % Update parameter vector
  theta(theta_index) = theta_red_new;

  % -- Put the parameters back into the weight matrices --
  W1_new = reshape(theta(parameters2+1:parameters),inputs+1,hidden)';
  W2_new = reshape(theta(1:parameters2),hidden+1,outputs)';

    
% >>>>>>>>>>>>>>>>>>>>   COMPUTE NETWORK OUTPUT  y2(theta+h)   <<<<<<<<<<<<<<<<<<<<
  h1 = W1_new*PHI;  
  y1(H_hidden,:) = pmntanh(h1(H_hidden,:));
  y1(L_hidden,:) = h1(L_hidden,:);
    
  h2 = W2_new*y1;
  y2(H_output,:) = pmntanh(h2(H_output,:));
  y2(L_output,:) = h2(L_output,:);

  E_new        = Y - y2;                 % Training error
  E_new_vector = E_new(:);               % Reshape E into a long vector
  SSE_new  = E_new_vector'*E_new_vector; % Sum of squared errors (SSE)
  PI_new   = (SSE_new + theta_red_new'*(D.*theta_red_new))/(2*N); % PI


% >>>>>>>>>>>>>>>>>>>>>>>>>>>       UPDATE  lambda     <<<<<<<<<<<<<<<<<<<<<<<<<<<<
  L = h'*G + h'*(h.*(D+lambda));
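  % Since (R + lambda*I + diag(D))*h = G, the second term equals h'*(G - R*h),
  % so L = 2*h'*G - h'*R*h: approximately the reduction of 2*N*PI predicted by
  % the quadratic model. Comparing it with the reduction actually obtained,
  % 2*N*(PI - PI_new), gives the gain ratio that drives the lambda update
  % below (cf. Madsen's notes).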

  % Decrease lambda if the actual decrease is large relative to the predicted one
  if 2*N*(PI - PI_new) > (0.75*L),
    lambda = lambda/2;
  
  % Increase lambda if the actual decrease is small relative to the predicted one
  elseif 2*N*(PI-PI_new) <= (0.25*L),
    lambda = 2*lambda;
  end


% >>>>>>>>>>>>>>>>>>>>       UPDATES FOR NEXT ITERATION        <<<<<<<<<<<<<<<<<<<<
  % Update only if criterion has decreased
  if PI_new < PI,                      
    W1 = W1_new;
    W2 = W2_new;
    theta_red = theta_red_new;
    E_vector = E_new_vector;
    PI = PI_new;
    dw = 1;
    iteration = iteration + 1;
    PI_vector(iteration-1) = PI;                             % Collect PI in vector
    fprintf('iteration # %i   PI = %4.3e\r',iteration-1,PI); % Print progress
  end

  % Check if stop condition is fulfilled
  if (PI < stop_crit) | (lambda>1e7), break, end             
end
%----------------------------------------------------------------------------------
%--------------              END OF NETWORK TRAINING                  -------------
%----------------------------------------------------------------------------------
PI_vector = PI_vector(1:iteration-1);
c=fix(clock);
fprintf('\n\nNetwork training ended at %2i.%2i.%2i\n',c(4),c(5),c(6));
