
rpe.m (from the collection "matlab實現神經網絡程序集合", a set of MATLAB neural network programs)

function [W1,W2,PI_vector,iter]=rpe(NetDef,W1,W2,PHI,Y,trparms,method)
%  RPE
%  --- 
%          Train a two layer neural network with a recursive prediction error
%          algorithm ("recursive Gauss-Newton"). Pruned (i.e., not fully
%          connected) networks can also be trained.
%
%          The activation functions can be either linear or tanh. The network
%          architecture is defined by the matrix 'NetDef', which consists of two
%          rows. The first row specifies the hidden layer while the second
%          specifies the output layer.
%
%          E.g.:    NetDef = ['LHHHH'
%                             'LL---']
% 
%          (L = linear, H = tanh)
%
%          A weight is pruned by setting it to zero.
%
%          The algorithm is described in:
%          L. Ljung: "System Identification - Theory for the User"
%          (Prentice-Hall, 1987)
%
%          Notice that the bias is included as the last column
%          in the weight matrices.
%
%  CALL:
%            [W1,W2,critvec,iter]=rpe(NetDef,W1,W2,PHI,Y,trparms,method)
%
%  INPUT:
%  NetDef: Network definition 
%  W1    : Input-to-hidden layer weights. The matrix dimension is
%          dim(W1) = [(# of hidden units) * (inputs + 1)] (the 1 is due to the bias)
%  W2    : hidden-to-output layer weights.
%           dim(W2) = [(outputs)  *  (# of hidden units + 1)]
%  PHI   : Input vector. dim(PHI) = [(inputs)  *  (# of data)]
%  Y     : Output data. dim(Y) = [(outputs)  * (# of data)]
%  trparms : Contains parameters associated with the training
%  method  : Training (=estimation) method (ff, ct, efra)
%     method = 'ff' (forgetting factor)
%                   trparms = [max_iter stop_crit p0 lambda]
%     method = 'ct' (constant trace)
%                   trparms = [max_iter stop_crit alpha_max alpha_min]
%     method = 'efra' (exponential forgetting and resetting algorithm)
%                   trparms = [max_iter stop_crit alpha beta delta lambda]
%       max_iter       : max # of iterations.
%       stop_crit      : Stop training if criterion gets below this value
%       p0             : The covariance matrix is initialized to p0*I 
%       lambda         : Forgetting factor
%       alpha_max      : Max. eigenvalue of P matrix
%       alpha_min      : Min. eigenvalue of P matrix
%       alpha, beta, delta: EFRA parameters
% 
%
%  OUTPUT:
%  W1, W2    : Weight matrices after training
%  critvec   : Vector containing the criterion of fit after each iteration
%  iter      : # of iterations
%  
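%  EXAMPLE:
%          A minimal sketch of a training call with the forgetting factor
%          method. The data, network size and training parameters below are
%          illustrative assumptions only (pmntanh, the tanh routine called
%          by this function, must be on the MATLAB path):
%
%            PHI     = randn(2,200);                  % 2 inputs, 200 samples
%            Y       = sin(PHI(1,:)) + 0.5*PHI(2,:);  % 1 output
%            NetDef  = ['HHHHH';'L----'];             % 5 tanh hidden units, 1 linear output
%            W1      = 0.1*randn(5,3);                % [hidden x (inputs+1)]
%            W2      = 0.1*randn(1,6);                % [outputs x (hidden+1)]
%            trparms = [50 0 10 0.995];               % [max_iter stop_crit p0 lambda]
%            [W1,W2,critvec,iter] = rpe(NetDef,W1,W2,PHI,Y,trparms,'ff');
%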
%  Programmed by : Magnus Norgaard, IAU/IMM, Technical University of Denmark 
%  LastEditDate  : July 17, 1996


%----------------------------------------------------------------------------------
%--------------             NETWORK INITIALIZATIONS                   -------------
%----------------------------------------------------------------------------------
max_iter = trparms(1);
stop_crit= trparms(2);
[outputs,N] = size(Y);                  % # of outputs and # of data
[hidden,inputs] = size(W1);             % # of hidden units 
inputs   =inputs-1;                     % # of inputs
L_hidden = find(NetDef(1,:)=='L')';     % Location of linear hidden neurons
H_hidden = find(NetDef(1,:)=='H')';     % Location of tanh hidden neurons
L_output = find(NetDef(2,:)=='L')';     % Location of linear output neurons
H_output = find(NetDef(2,:)=='H')';     % Location of tanh output neurons
PI_vector= [];                          % Vector for the criterion value (PI) per iteration
y1       = zeros(hidden,1);             % Hidden layer outputs
y2       = zeros(outputs,1);            % Network output
index = outputs*(hidden+1) + 1 + [0:hidden-1]*(inputs+1); % Row in PSI (position in theta) of each hidden unit's first weight
PHI_aug  = [PHI;ones(1,N)];             % Augment PHI with a row containing ones
parameters1= hidden*(inputs+1);         % # of input-to-hidden weights
parameters2= outputs*(hidden+1);        % # of hidden-to-output weights
parameters = parameters1 + parameters2; % Total # of weights
PSI        = zeros(parameters,outputs); % Deriv. of each output w.r.t. each weight
                                        % Parametervector containing all weights
theta = [reshape(W2',parameters2,1) ; reshape(W1',parameters1,1)];
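% Weights that are exactly zero are regarded as pruned; they are excluded
% from the recursive estimation, and only the reduced vector theta_red is
% updated (theta_index maps it back into the full parameter vector).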
theta_index = find(theta);              % Index to weights<>0
theta_red = theta(theta_index);         % Reduced parameter vector
reduced  = length(theta_index);         % The # of parameters in theta_red
index3= 1:(reduced+1):(reduced^2);      % Linear indices of the diagonal of P
if strcmp(method,'ff'),                 % Forgetting factor method
  mflag     = 1;                        % Method flag
  lambda    = trparms(4);               % Forgetting factor
  p0        = trparms(3);
  P         = p0 * eye(reduced);        % Initialize covariance matrix
elseif strcmp(method,'ct'),             % Constant trace method
  mflag     = 2;                        % Method flag
  alpha_max = trparms(3);               % Max. eigenvalue
  alpha_min = trparms(4);               % Min. eigenvalue
  P      = alpha_max * eye(reduced);    % Initialize covariance matrix
elseif strcmp(method,'efra'),           % EFRA method
  mflag     = 3;                        % Method flag
  alpha     = trparms(3);               % EFRA parameters
  beta      = trparms(4);
  delta     = trparms(5);
  lambda    = trparms(6);
  gamma     = (1-lambda)/lambda;
                                        % Max. eigenvalue
  maxeig = gamma/(2*delta)*(1+sqrt(1+4*beta*delta/(gamma*gamma)));
  P      = maxeig * eye(reduced);       % Initialize covariance matrix
  betaI     = beta*eye(reduced);        % Useful diagonal matrix
else
  error('Unknown method. Use ''ff'', ''ct'' or ''efra''.');
end
I        = eye(outputs);                % (outputs x outputs) identity matrix
if mflag == 1,                          % lambda is undefined for the 'ct' method
  lambdaI  = lambda*I;                  % Diagonal matrix used in the 'ff' update
end


%----------------------------------------------------------------------------------
%-------------                    TRAIN NETWORK                       -------------
%----------------------------------------------------------------------------------
clc;
c=fix(clock);
fprintf('Network training started at %2i.%2i.%2i\n\n',c(4),c(5),c(6));

for iteration=1:max_iter,
  SSE=0;
  for t=1:N,

% >>>>>>>>>>>>>>>>>>>>>>>>  COMPUTE NETWORK OUTPUT y2(theta) <<<<<<<<<<<<<<<<<<<<<<
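    % Forward pass: y1 = f1(W1*[PHI(:,t);1]) and y2 = f2(W2*[y1;1]), where f1/f2
    % apply tanh to the 'H' units and the identity to the 'L' units (pmntanh
    % evaluates the tanh).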
    h1 = W1(:,1:inputs)*PHI(:,t) + W1(:,inputs+1);  
    y1(H_hidden) = pmntanh(h1(H_hidden)); 
    y1(L_hidden) = h1(L_hidden);
    
    h2 = W2(:,1:hidden)*y1 + W2(:,hidden+1);
    y2(H_output) = pmntanh(h2(H_output));
    y2(L_output) = h2(L_output);

    y1_aug=[y1;1];
    E = Y(:,t) - y2;                      % Training error


%>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  COMPUTE THE PSI MATRIX  <<<<<<<<<<<<<<<<<<<<<<<<<<<
% (The derivative of each y2(t) with respect to each weight)
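% Chain rule applied below (with d tanh(x)/dx = 1 - tanh(x)^2):
%   Linear output i:  dy2(i)/dW2(i,:)' = y1_aug
%                     dy2(i)/dW1(j,:)' = W2(i,j)*PHI_aug(:,t)              (linear hidden j)
%                     dy2(i)/dW1(j,:)' = W2(i,j)*(1-y1(j)^2)*PHI_aug(:,t)  (tanh hidden j)
%   Tanh output i:    each of the above multiplied by (1-y2(i)^2)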

    % ==========   Elements corresponding to the linear output units   ============
    for i = L_output'

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      index1 = (i-1) * (hidden + 1) + 1;
      PSI(index1:index1+hidden,i) = y1_aug;
      % ---------------------------------------------------------------------
 
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*PHI_aug(:,t);
      end
      
      for j = H_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y1(j)*y1(j))*PHI_aug(:,t);
      end 
      % ---------------------------------------------------------------------    
    end

    % ============  Elements corresponding to the tanh output units   =============
    for i = H_output',
      % -- The part of PSI corresponding to hidden-to-output layer weights --
      index1 = (i-1) * (hidden + 1) + 1;
      PSI(index1:index1+hidden,i) = y1_aug * (1 - y2(i)*y2(i));
      % ---------------------------------------------------------------------
       
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y2(i)*y2(i))...
                                                              * PHI_aug(:,t);
      end
      
      for j = H_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y1(j)*y1(j))...
                                             *(1-y2(i)*y2(i)) * PHI_aug(:,t);
      end
      % ---------------------------------------------------------------------
    end



%>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    UPDATE THE WEIGHTS    <<<<<<<<<<<<<<<<<<<<<<<<<<<
    PSI_red = PSI(theta_index,:);         % Rows of PSI for the non-pruned weights
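    % Generic recursive Gauss-Newton step (Ljung, 1987):
    %   K(t)     = P(t-1)*PSI_red * inv(Lambda + PSI_red'*P(t-1)*PSI_red)
    %   theta(t) = theta(t-1) + K(t)*E
    % The branches below differ in how P is propagated afterwards so that its
    % eigenvalues remain bounded (forgetting factor, constant trace, or EFRA).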
    
    % ---------- Forgetting factor method ----------
    if mflag == 1,
      % -- Update P matrix --
      P = (P - P*PSI_red*inv(lambdaI + PSI_red'*P*PSI_red)*PSI_red'*P ) / lambda;

      % -- Update Parameters --
      theta_red = theta_red + P*PSI_red*E;
      
    % ----------  Constant trace method   ---------- 
    elseif mflag == 2,
      % -- Measurement update of P matrix --
      P = (P - P*PSI_red * inv(I + PSI_red'*P*PSI_red) * PSI_red'*P );

      % -- Update Parameters --
      theta_red = theta_red + P*PSI_red*E;

      % -- Time update of P matrix --
      P         = ((alpha_max-alpha_min)/trace(P))*P;
      P(index3) = P(index3)+alpha_min;
      
    % ----------       EFRA method        ---------- 
    else 
      % -- Correction factor --
      K = P*PSI_red * (alpha*inv(I + PSI_red'*P*PSI_red));

      % -- Update Parameters --
      theta_red = theta_red + K*E;
      
      % -- Update P --
      P = P/lambda - K*PSI_red'*P + betaI-delta*P*P;
    end
    theta(theta_index) = theta_red;       % Put estimated weights back into theta


    % -- Put the parameters back into the weight matrices --
    W1 = reshape(theta(parameters2+1:parameters),inputs+1,hidden)';
    W2 = reshape(theta(1:parameters2),hidden+1,outputs)';

    % -- Accumulate SSE --
    SSE = SSE + E'*E;
  end
  
  
%>>>>>>>>>>>>>>>>>>>>>>       UPDATES FOR NEXT ITERATION       <<<<<<<<<<<<<<<<<<<<
  PI = SSE/(2*N);                       % Criterion value: half the mean squared error
  PI_vector(iteration) = PI;            % Collect PI

  fprintf('iteration # %i   PI = %4.3e\r',iteration,PI); % Print progress information
  if PI < stop_crit, break, end         % Check if stop condition is fulfilled
end


%----------------------------------------------------------------------------------
%-------------              END OF NETWORK TRAINING                  --------------
%----------------------------------------------------------------------------------
c=fix(clock);
fprintf('\n\nNetwork training ended at %2i.%2i.%2i\n',c(4),c(5),c(6));
