marq.m
From: Neural Networks -- Applications of MATLAB (example programs)
function [W1,W2,PI_vector,iteration,lambda]=marq(NetDef,W1,W2,PHI,Y,trparms)
%  MARQ
%  ----
%              Train a two-layer neural network with the Levenberg-Marquardt
%              method.
%
%          If desired, regularization by weight decay can be applied.
%          Pruned (i.e., not fully connected) networks can also be
%          trained.
%
%          Given a set of corresponding input-output pairs and an initial
%          network,
%          [W1,W2,PI_vector,iteration,lambda]=marq(NetDef,W1,W2,PHI,Y,trparms)
%          trains the network with the Levenberg-Marquardt method.
% 
%          The activation functions can be either linear or tanh. The
%          network architecture is defined by the matrix 'NetDef' which
%          has two rows. The first row specifies the hidden layer and the
%          second row specifies the output layer.
% 
%          E.g.:    NetDef = ['LHHHH' 
%                             'LL---']
%
%          (L = Linear, H = tanh)
%
%          A weight is pruned by setting it to zero.
%
%          The Marquardt method is described in:
%          K. Madsen: 'Optimering' (Haefte 38), IMM, DTU, 1991
%  
%          Notice that the bias is included as the last column in the weight
%          matrices.
%
% 
%  INPUT:
%  NetDef: Network definition 
%  W1    : Input-to-hidden layer weights. The matrix dimension is
%          dim(W1) = [(# of hidden units) * (inputs + 1)] (the 1 is due to the bias)
%  W2    : hidden-to-output layer weights.
%           dim(W2) = [(outputs)  *  (# of hidden units + 1)]
%  PHI   : Input vector. dim(PHI) = [(inputs)  *  (# of data)]
%  Y     : Output data. dim(Y) = [(outputs)  * (# of data)]
%  trparms : Vector containing parameters associated with the training
%            trparms = [max_iter stop_crit lambda D]
%             max_iter  : max # of iterations.
%             stop_crit : Stop training if criterion is below this value
%             lambda    : Initial Levenberg-Marquardt parameter
%             D         : Row vector containing the weight decay parameters.
%                            If D has one element a scalar weight decay will be
%                            used. If D has two elements the first element will
%                            be used as weight decay for the hidden-to-output
%                            layer weights, while the second will be used for the
%                            input-to-hidden layer weights. For individual weight decays,
%                            D must contain as many elements as there are
%                            weights in the network.
%
%           Default values are (obtained if left out): trparms = [500 0 1 0] 
% 
% 
%  OUTPUT:
%  W1, W2   : Weight matrices after training
%  PI_vector: Vector containing the criterion evaluated at each iteration
%  iteration: # of iterations
%  lambda   : The final value of lambda. Relevant only if retraining is desired
% 
%  Programmed by : Magnus Norgaard, IAU/IMM
%  LastEditDate  : July 16, 1996
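%
%  Example (a hypothetical sketch for illustration only -- the data and
%  network size below are made up and are not part of the original
%  toolbox): fit one noisy output of a 3-input problem with 5 tanh
%  hidden units and one linear output unit.
%
%     PHI    = randn(3,100);                    % 3 inputs, 100 data points
%     Y      = sin(PHI(1,:)) + 0.1*randn(1,100);% 1 noisy output
%     NetDef = ['HHHHH';'L----'];               % 5 tanh hidden, 1 linear output
%     W1     = 0.1*randn(5,4);                  % 5 x (3 inputs + 1 bias)
%     W2     = 0.1*randn(1,6);                  % 1 x (5 hidden units + 1 bias)
%     trparms = [500 0 1 0];                    % the defaults, made explicit
%     [W1,W2,PI_vector,iteration,lambda] = marq(NetDef,W1,W2,PHI,Y,trparms);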


%----------------------------------------------------------------------------------
%--------------             NETWORK INITIALIZATIONS                   -------------
%----------------------------------------------------------------------------------
[outputs,N] = size(Y);                  % # of outputs and # of data
[hidden,inputs] = size(W1);             % # of hidden units 
inputs=inputs-1;                        % # of inputs
L_hidden = find(NetDef(1,:)=='L')';     % Location of linear hidden neurons
H_hidden = find(NetDef(1,:)=='H')';     % Location of tanh hidden neurons
L_output = find(NetDef(2,:)=='L')';     % Location of linear output neurons
H_output = find(NetDef(2,:)=='H')';     % Location of tanh output neurons
y1       = [zeros(hidden,N);ones(1,N)]; % Hidden layer outputs
y2       = zeros(outputs,N);            % Network output
index = outputs*(hidden+1) + 1 + [0:hidden-1]*(inputs+1); % Start in theta of each hidden unit's input weights
index2 = (0:N-1)*outputs;               % Column offset in PSI for each data point
iteration = 1;                          % Counter variable
dw       = 1;                           % Flag telling that the weights are new
PHI      = [PHI;ones(1,N)];             % Augment PHI with a row containing ones
parameters1= hidden*(inputs+1);         % # of input-to-hidden weights
parameters2= outputs*(hidden+1);        % # of hidden-to-output weights
parameters = parameters1 + parameters2; % Total # of weights
PSI      = zeros(parameters,outputs*N); % Deriv. of each output w.r.t. each weight
ones_h   = ones(hidden+1,1);            % A vector of ones
ones_i   = ones(inputs+1,1);            % Another vector of ones
                                        % Parameter vector containing all weights
theta = [reshape(W2',parameters2,1) ; reshape(W1',parameters1,1)];
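                                        % (W2 weights first, row by row, then W1)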
theta_index = find(theta);              % Indices of the nonzero (non-pruned) weights
theta_red = theta(theta_index);         % Reduced parameter vector
reduced  = length(theta_index);         % The # of parameters in theta_red
index3   = 1:(reduced+1):(reduced^2);   % Linear indices of the diagonal of the reduced Hessian
if ~exist('trparms')                    % Default training parameters
  max_iter  = 500;
  stop_crit = 0;
  lambda    = 1;
  D         = 0;
else                                    % User specified values
  max_iter  = trparms(1);
  stop_crit = trparms(2);
  lambda    = trparms(3);
  if length(trparms)==4,                % Scalar weight decay parameter
    D = trparms(4*ones(1,reduced))';      
  elseif length(trparms)==5,            % Two weight decay parameters
    D = trparms([4*ones(1,parameters2) 5*ones(1,parameters1)])';
    D = D(theta_index);
  elseif length(trparms)>5,             % Individual weight decay
    D = trparms(4:length(trparms))';
  end
end
PI_vector = zeros(max_iter,1);          % Vector collecting the criterion (PI) at each iteration


%----------------------------------------------------------------------------------
%--------------                   TRAIN NETWORK                       -------------
%----------------------------------------------------------------------------------
clc;
c=fix(clock);
fprintf('Network training started at %2i.%2i.%2i\n\n',c(4),c(5),c(6));


% >>>>>>>>>>>>>>>>>>>>>  COMPUTE NETWORK OUTPUT  y2(theta)   <<<<<<<<<<<<<<<<<<<<<<
h1 = W1*PHI;  
y1(H_hidden,:) = pmntanh(h1(H_hidden,:));
y1(L_hidden,:) = h1(L_hidden,:);    

h2 = W2*y1;
y2(H_output,:) = pmntanh(h2(H_output,:));
y2(L_output,:) = h2(L_output,:);

E        = Y - y2;                      % Training error
E_vector = E(:);                        % Reshape E into a long vector
SSE      = E_vector'*E_vector;          % Sum of squared errors (SSE)
PI       = (SSE+theta_red'*(D.*theta_red))/(2*N); % Performance index
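% The criterion minimized is thus PI = (SSE + theta'*diag(D)*theta)/(2*N),
% i.e. the weight-decay-regularized sum of squared errors, normalized by
% twice the number of data points.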

while iteration<=max_iter    
if dw==1,
% >>>>>>>>>>>>>>>>>>>>>>>>>>>   COMPUTE THE PSI MATRIX   <<<<<<<<<<<<<<<<<<<<<<<<<<
% (The derivative of each network output (y2) with respect to each weight)

    % ==========   Elements corresponding to the linear output units   ============
    for i = L_output'
      index1 = (i-1) * (hidden + 1) + 1;

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      PSI(index1:index1+hidden,index2+i) = y1;
      % ---------------------------------------------------------------------
 
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,index2+i) = W2(i,j)*PHI;
      end

      for j = H_hidden',
        tmp = W2(i,j)*(1-y1(j,:).*y1(j,:)); 
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:).*PHI;
      end 
      % ---------------------------------------------------------------------    
    end

    % ============  Elements corresponding to the tanh output units   =============
    for i = H_output',
      index1 = (i-1) * (hidden + 1) + 1;

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      tmp = 1 - y2(i,:).*y2(i,:);
      PSI(index1:index1+hidden,index2+i) = y1.*tmp(ones_h,:);
      % ---------------------------------------------------------------------
     
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        tmp = W2(i,j)*(1-y2(i,:).*y2(i,:));
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:).*PHI;
      end
      
      for j = H_hidden',
        tmp  = W2(i,j)*(1-y1(j,:).*y1(j,:));
        tmp2 = (1-y2(i,:).*y2(i,:));
        PSI(index(j):index(j)+inputs,index2+i) = tmp(ones_i,:)...
                                                  .*tmp2(ones_i,:).*PHI;
      end
      % ---------------------------------------------------------------------
    end
    PSI_red = PSI(theta_index,:);
    
    % -- Gradient --
    G = PSI_red*E_vector-D.*theta_red;

    % -- Mean-square-error part of the Hessian --
    R = PSI_red*PSI_red';
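    % R is the Gauss-Newton approximation to the Hessian of the SSE part of
    % the criterion: second-order terms involving the residuals are ignored,
    % which is what makes this a Levenberg-Marquardt method.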
    
    dw = 0;
  end
  
   
% >>>>>>>>>>>>>>>>>>>>>>>>>>>        COMPUTE h_k        <<<<<<<<<<<<<<<<<<<<<<<<<<<
  % -- Hessian  --
  H = R;
  H(index3) = H(index3)'+lambda+D;                  % Add diagonal matrix

  % -- Search direction --
  h = H\G;                                          % Solve for search direction
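  % i.e. h solves (R + diag(lambda + D))*h = G. A large lambda shortens the
  % step towards steepest descent, while lambda -> 0 recovers the
  % (regularized) Gauss-Newton step.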

  % -- Compute 'a priori' iterate --
  theta_red_new = theta_red + h;                    % Update parameter vector
  theta(theta_index) = theta_red_new;

  % -- Put the parameters back into the weight matrices --
  W1_new = reshape(theta(parameters2+1:parameters),inputs+1,hidden)';
  W2_new = reshape(theta(1:parameters2),hidden+1,outputs)';

    
% >>>>>>>>>>>>>>>>>>>>   COMPUTE NETWORK OUTPUT  y2(theta+h)   <<<<<<<<<<<<<<<<<<<<
  h1 = W1_new*PHI;  
  y1(H_hidden,:) = pmntanh(h1(H_hidden,:));
  y1(L_hidden,:) = h1(L_hidden,:);
    
  h2 = W2_new*y1;
  y2(H_output,:) = pmntanh(h2(H_output,:));
  y2(L_output,:) = h2(L_output,:);

  E_new        = Y - y2;                 % Training error
  E_new_vector = E_new(:);               % Reshape E into a long vector
  SSE_new  = E_new_vector'*E_new_vector; % Sum of squared errors (SSE)
  PI_new   = (SSE_new + theta_red_new'*(D.*theta_red_new))/(2*N); % PI


% >>>>>>>>>>>>>>>>>>>>>>>>>>>       UPDATE  lambda     <<<<<<<<<<<<<<<<<<<<<<<<<<<<
  L = h'*G + h'*(h.*(D+lambda));
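  % L is the decrease in the (unnormalized) criterion predicted by the local
  % quadratic model (cf. the Madsen reference above); comparing it with the
  % actual decrease 2*N*(PI - PI_new) below gives the usual Marquardt
  % acceptance ratio that drives the lambda update.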

  % Decrease lambda if SSE has fallen 'sufficiently'
  if 2*N*(PI - PI_new) > (0.75*L),
    lambda = lambda/2;
  
  % Increase lambda if SSE has grown 'sufficiently'
  elseif 2*N*(PI-PI_new) <= (0.25*L),
    lambda = 2*lambda;
  end


% >>>>>>>>>>>>>>>>>>>>       UPDATES FOR NEXT ITERATION        <<<<<<<<<<<<<<<<<<<<
  % Update only if criterion has decreased
  if PI_new < PI,                      
    W1 = W1_new;
    W2 = W2_new;
    theta_red = theta_red_new;
    E_vector = E_new_vector;
    PI = PI_new;
    dw = 1;
    iteration = iteration + 1;
    PI_vector(iteration-1) = PI;                             % Collect PI in vector
    fprintf('iteration # %i   PI = %4.3e\r',iteration-1,PI); % Print progress on-line
  end

  % Check if stop condition is fulfilled
  if (PI < stop_crit) | (lambda>1e7), break, end             
end
%----------------------------------------------------------------------------------
%--------------              END OF NETWORK TRAINING                  -------------
%----------------------------------------------------------------------------------
PI_vector = PI_vector(1:iteration-1);
c=fix(clock);
fprintf('\n\nNetwork training ended at %2i.%2i.%2i\n',c(4),c(5),c(6));
