function [W1,W2,PI_vector,iter]=rpe(NetDef,W1,W2,PHI,Y,trparms,method)
%  RPE
%  --- 
%          Train a two layer neural network with a recursive prediction error
%          algorithm ("recursive Gauss-Newton"). Pruned (i.e., not fully
%          connected) networks can also be trained.
%
%          The activation functions can be either linear or tanh. The network
%          architecture is defined by the matrix 'NetDef', which consists of two
%          rows. The first row specifies the hidden layer while the second
%          specifies the output layer.
%
%          E.g.:    NetDef = ['LHHHH'
%                             'LL---']
% 
%          (L = linear, H = tanh)
%
%          A weight is pruned by setting it to zero.
%
%          The algorithm is described in:
%          L. Ljung: "System Identification - Theory for the User"
%          (Prentice-Hall, 1987)
%
%          Notice that the bias is included as the last column
%          in the weight matrices.
%
%  CALL:
%            [W1,W2,critvec,iter]=rpe(NetDef,W1,W2,PHI,Y,trparms,method)
%
%  INPUT:
%  NetDef: Network definition 
%  W1    : Input-to-hidden layer weights. The matrix dimension is
%          dim(W1) = [(# of hidden units) * (inputs + 1)] (the 1 is due to the bias)
%  W2    : Hidden-to-output layer weights. The matrix dimension is
%          dim(W2) = [(outputs)  *  (# of hidden units + 1)]
%  PHI   : Input vector. dim(PHI) = [(inputs)  *  (# of data)]
%  Y     : Output data. dim(Y) = [(outputs)  * (# of data)]
%  trparms : Contains parameters associated with the training
%  method  : Training (=estimation) method (ff, ct, efra)
%     method = 'ff' (forgetting factor)
%                   trparms = [max_iter stop_crit p0 lambda]
%     method = 'ct' (constant trace)
%                   trparms = [max_iter stop_crit alpha_max alpha_min]
%     method = 'efra' (exponential forgetting and resetting algorithm)
%                   trparms = [max_iter stop_crit alpha beta delta lambda]
%       max_iter       : max # of iterations.
%       stop_crit      : Stop training if criterion gets below this value
%       p0             : The covariance matrix is initialized to p0*I 
%       lambda         : Forgetting factor
%       alpha_max      : Max. eigenvalue of P matrix
%       alpha_min      : Min. eigenvalue of P matrix
%       alpha, beta, delta: EFRA parameters
% 
%
%  OUTPUT:
%  W1, W2    : Weight matrices after training
%  critvec   : Vector containing the criterion of fit after each iteration
%  iter      : # of iterations
%  
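%  EXAMPLE:
%  A minimal sketch of a call using the forgetting factor method. The data,
%  dimensions, and parameter values below are illustrative assumptions, not
%  values taken from the original documentation:
%
%            NetDef  = ['HHHHH';'L----'];       % 5 tanh hidden units, 1 linear output
%            W1      = 0.1*randn(5,3);          % (hidden units) x (2 inputs + 1)
%            W2      = 0.1*randn(1,6);          % (outputs) x (hidden units + 1)
%            PHI     = randn(2,100);            % 2 inputs, 100 data points
%            Y       = randn(1,100);            % 1 output, 100 data points
%            trparms = [50 1e-4 10 0.995];      % max_iter, stop_crit, p0, lambda
%            [W1,W2,critvec,iter] = rpe(NetDef,W1,W2,PHI,Y,trparms,'ff');
%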
%  Programmed by : Magnus Norgaard, IAU/IMM, Technical University of Denmark 
%  LastEditDate  : July 17, 1996


%----------------------------------------------------------------------------------
%--------------             NETWORK INITIALIZATIONS                   -------------
%----------------------------------------------------------------------------------
max_iter = trparms(1);
stop_crit= trparms(2);
[outputs,N] = size(Y);                  % # of outputs and # of data
[hidden,inputs] = size(W1);             % # of hidden units 
inputs   =inputs-1;                     % # of inputs
L_hidden = find(NetDef(1,:)=='L')';     % Location of linear hidden neurons
H_hidden = find(NetDef(1,:)=='H')';     % Location of tanh hidden neurons
L_output = find(NetDef(2,:)=='L')';     % Location of linear output neurons
H_output = find(NetDef(2,:)=='H')';     % Location of tanh output neurons
PI_vector= [];                          % Collects the criterion (PI) after each iteration
y1       = zeros(hidden,1);             % Hidden layer outputs
y2       = zeros(outputs,1);            % Network output
index = outputs*(hidden+1) + 1 + [0:hidden-1]*(inputs+1); % Start index in theta of each hidden unit's weights
PHI_aug  = [PHI;ones(1,N)];             % Augment PHI with a row containing ones
parameters1= hidden*(inputs+1);         % # of input-to-hidden weights
parameters2= outputs*(hidden+1);        % # of hidden-to-output weights
parameters = parameters1 + parameters2; % Total # of weights
PSI        = zeros(parameters,outputs); % Deriv. of each output w.r.t. each weight
                                        % Parameter vector containing all weights
theta = [reshape(W2',parameters2,1) ; reshape(W1',parameters1,1)];
theta_index = find(theta);              % Indices of the nonzero (unpruned) weights
theta_red = theta(theta_index);         % Reduced parameter vector
reduced  = length(theta_index);         % The # of parameters in theta_red
index3= 1:(reduced+1):(reduced^2);      % Linear indices of the diagonal of P
if strcmp(method,'ff'),                 % Forgetting factor method
  mflag     = 1;                        % Method flag
  lambda    = trparms(4);               % Forgetting factor
  p0        = trparms(3);
  P         = p0 * eye(reduced);        % Initialize covariance matrix
elseif strcmp(method,'ct'),             % Constant trace method
  mflag     = 2;                        % Method flag
  alpha_max = trparms(3);               % Max. eigenvalue
  alpha_min = trparms(4);               % Min. eigenvalue
  P      = alpha_max * eye(reduced);    % Initialize covariance matrix
elseif strcmp(method,'efra'),           % EFRA method
  mflag     = 3;                        % Method flag
  alpha     = trparms(3);               % EFRA parameters
  beta      = trparms(4);
  delta     = trparms(5);
  lambda    = trparms(6);
  gamma     = (1-lambda)/lambda;
                                        % Max. eigenvalue
  maxeig = gamma/(2*delta)*(1+sqrt(1+4*beta*delta/(gamma*gamma)));
  P      = maxeig * eye(reduced);       % Initialize covariance matrix
  betaI     = beta*eye(reduced);        % Useful diagonal matrix
end
I        = eye(outputs);                % (outputs x outputs) identity matrix
if mflag == 1,                          % Only the forgetting factor method uses lambda*I
  lambdaI  = lambda*I;                  % Diagonal matrix
end


%----------------------------------------------------------------------------------
%-------------                    TRAIN NETWORK                       -------------
%----------------------------------------------------------------------------------
clc;
c=fix(clock);
fprintf('Network training started at %2i.%2i.%2i\n\n',c(4),c(5),c(6));

for iteration=1:max_iter,
  SSE=0;
  for t=1:N,

% >>>>>>>>>>>>>>>>>>>>>>>>  COMPUTE NETWORK OUTPUT y2(theta) <<<<<<<<<<<<<<<<<<<<<<
    h1 = W1(:,1:inputs)*PHI(:,t) + W1(:,inputs+1);  
    y1(H_hidden) = pmntanh(h1(H_hidden)); 
    y1(L_hidden) = h1(L_hidden);
    
    h2 = W2(:,1:hidden)*y1 + W2(:,hidden+1);
    y2(H_output) = pmntanh(h2(H_output));
    y2(L_output) = h2(L_output);

    y1_aug=[y1;1];
    E = Y(:,t) - y2;                      % Training error


%>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  COMPUTE THE PSI MATRIX  <<<<<<<<<<<<<<<<<<<<<<<<<<<
% (The derivative of each y2(t) with respect to each weight)
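%
% A sketch of the chain rule applied below, spelled out for reference in the
% code's own notation: for output unit i and hidden unit j,
%    dy2(i)/dW2(i,:) = f2'(h2(i)) * [y1;1]'
%    dy2(i)/dW1(j,:) = f2'(h2(i)) * W2(i,j) * f1'(h1(j)) * [PHI(:,t);1]'
% where f'(h) = 1 for a linear unit and f'(h) = 1 - tanh(h)^2 for a tanh unit.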

    % ==========   Elements corresponding to the linear output units   ============
    for i = L_output'

      % -- The part of PSI corresponding to hidden-to-output layer weights --
      index1 = (i-1) * (hidden + 1) + 1;
      PSI(index1:index1+hidden,i) = y1_aug;
      % ---------------------------------------------------------------------
 
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*PHI_aug(:,t);
      end
      
      for j = H_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y1(j)*y1(j))*PHI_aug(:,t);
      end 
      % ---------------------------------------------------------------------    
    end

    % ============  Elements corresponding to the tanh output units   =============
    for i = H_output',
      % -- The part of PSI corresponding to hidden-to-output layer weights --
      index1 = (i-1) * (hidden + 1) + 1;
      PSI(index1:index1+hidden,i) = y1_aug * (1 - y2(i)*y2(i));
      % ---------------------------------------------------------------------
       
      % -- The part of PSI corresponding to input-to-hidden layer weights ---
      for j = L_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y2(i)*y2(i))...
                                                              * PHI_aug(:,t);
      end
      
      for j = H_hidden',
        PSI(index(j):index(j)+inputs,i) = W2(i,j)*(1-y1(j)*y1(j))...
                                             *(1-y2(i)*y2(i)) * PHI_aug(:,t);
      end
      % ---------------------------------------------------------------------
    end



%>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    UPDATE THE WEIGHTS    <<<<<<<<<<<<<<<<<<<<<<<<<<<
    PSI_red = PSI(theta_index,:);       % Rows of PSI corresponding to the nonzero weights
    
    % ---------- Forgetting factor method ----------
    if mflag == 1,
      % -- Update P matrix --
      P = (P - P*PSI_red*inv(lambdaI + PSI_red'*P*PSI_red)*PSI_red'*P ) / lambda;

      % -- Update Parameters --
      theta_red = theta_red + P*PSI_red*E;
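      % (Because the updated P satisfies P*PSI_red = P_old*PSI_red*inv(lambdaI
      %  + PSI_red'*P_old*PSI_red), the step above equals the standard RLS gain
      %  times the prediction error.)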
      
    % ----------  Constant trace method   ---------- 
    elseif mflag == 2,
      % -- Measurement update of P matrix --
      P = (P - P*PSI_red * inv(I + PSI_red'*P*PSI_red) * PSI_red'*P );

      % -- Update Parameters --
      theta_red = theta_red + P*PSI_red*E;

      % -- Time update of P matrix --
      P         = ((alpha_max-alpha_min)/trace(P))*P;
      P(index3) = P(index3)+alpha_min;
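      % (This rescaling keeps trace(P) constant from sample to sample and
      %  bounds the eigenvalues of P between alpha_min and alpha_max, which is
      %  what gives the constant trace method its name.)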
      
    % ----------       EFRA method        ---------- 
    else 
      % -- Correction factor --
      K = P*PSI_red * (alpha*inv(I + PSI_red'*P*PSI_red));

      % -- Update Parameters --
      theta_red = theta_red + K*E;
      
      % -- Update P --
      P = P/lambda - K*PSI_red'*P + betaI-delta*P*P;
    end
    theta(theta_index) = theta_red;       % Put estimated weights back into theta


    % -- Put the parameters back into the weight matrices --
    W1 = reshape(theta(parameters2+1:parameters),inputs+1,hidden)';
    W2 = reshape(theta(1:parameters2),hidden+1,outputs)';

    % -- Accumulate SSE --
    SSE = SSE + E'*E;
  end
  
  
%>>>>>>>>>>>>>>>>>>>>>>       UPDATES FOR NEXT ITERATION       <<<<<<<<<<<<<<<<<<<<
  PI = SSE/(2*N);                       % Criterion: half the mean squared error
  PI_vector(iteration) = PI;            % Collect PI

  fprintf('iteration # %i   PI = %4.3e\r',iteration,PI); % Print progress information
  if PI < stop_crit, break, end         % Check if stop condition is fulfilled
end


%----------------------------------------------------------------------------------
%-------------              END OF NETWORK TRAINING                  --------------
%----------------------------------------------------------------------------------
c=fix(clock);
fprintf('\n\nNetwork training ended at %2i.%2i.%2i\n',c(4),c(5),c(6));
