rls_AR_pred.m
function [Wo, xp, alpha, e] = rls_AR_pred(Xi, Y, verbose, lambda, delta)
% function [Wo, xp, alpha, e] = rls_AR_pred(Xi, Y, verbose, lambda, delta)
%
% rls_AR_pred.m - use basic RLS algorithm to predict real-valued AR process
% written for MATLAB 4.0
%
% Reference: Haykin, _Adaptive Filter Theory_, 2nd (corr.) ed., 1991
%
% Note that we use the algorithm in Table 13.2, i.e.,
% we do not exploit the Hermitian property of P(n), to
% minimize the possibility of numerical instability.
%
%
% Input parameters:
%   Xi     : matrix of training/test points - each row is
%            considered one input (regressor) vector
%   Y      : vector of corresponding desired outputs for the predictor
%   verbose: set to 1 to print the prediction error magnitude at each time step
%   lambda : forgetting factor (0 < lambda <= 1)
%   delta  : initialization constant; P is initialized to eye(d)/delta
%
% Output parameters:
%   Wo     : column-wise matrix of the weight vector at each iteration
%   xp     : row vector of predicted outputs
%   alpha  : row vector of a priori prediction errors
%   e      : row vector of a posteriori prediction errors, Y - xp
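%
% The rows of Xi are typically delay-embedded samples of the series being
% predicted, e.g. Xi(n, :) = [x(n+d-1) ... x(n)] with desired output
% Y(n) = x(n+d); a usage sketch along these lines follows the function body.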
% maximum number of time steps that can be predicted
N = size(Xi, 1);
% order of the predictor (number of taps)
d = size(Xi, 2);
% initialize the weight vector, weight history, error vectors, and the
% inverse-correlation matrix estimate P(0) = eye(d)/delta
W = zeros(d, 1);
Wo = zeros(d, N);
xp = zeros(1, N);
alpha = zeros(1, N);
e = zeros(1, N);
P = eye(d) / delta;
for n = 1:N,
% save the weights used at this time step
Wo(:, n) = W;
% adapt the weight vector
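% For reference, the steps below implement the real-valued RLS recursion
% (Haykin, Table 13.2), written here in the listing's own variable names:
%   k(n)     = P(n-1) u(n) / ( lambda + u(n)' P(n-1) u(n) )
%   alpha(n) = Y(n) - W(n-1)' u(n)                  (a priori error)
%   W(n)     = W(n-1) + k(n) alpha(n)
%   P(n)     = ( P(n-1) - k(n) u(n)' P(n-1) ) / lambda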
u = Xi(n, :)';
p = u' * P;
kappa = lambda + p * u;
k = P * u / kappa;
alpha(n) = Y(n) - W' * u;
W = W + k * alpha(n);
Pp = k * p;
P = (P - Pp) / lambda;
% a posteriori prediction of the current sample and its error
xp(n) = W' * u;
e(n) = Y(n) - xp(n);
if (verbose ~= 0)
disp(['time step ', int2str(n), ': mag. pred. err. = ', num2str(abs(e(n)))]);
end;
end % for n
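
% Usage sketch (not part of the original listing): generate a synthetic AR(2)
% series, build the delay-embedded regressor matrix Xi and target vector Y,
% and run the predictor.  The AR coefficients, noise level, forgetting factor,
% and delta used below are illustrative assumptions, not prescribed values.
%
%   d  = 2;  Ns = 500;                    % predictor order, series length
%   x  = zeros(Ns, 1);
%   v  = 0.1 * randn(Ns, 1);              % driving white noise
%   for n = 3:Ns
%       x(n) = 1.2 * x(n-1) - 0.7 * x(n-2) + v(n);   % stable AR(2) process
%   end
%   N  = Ns - d;
%   Xi = zeros(N, d);
%   Y  = zeros(N, 1);
%   for n = 1:N
%       Xi(n, :) = x(n+d-1:-1:n)';        % regressor [x(n+d-1) ... x(n)]
%       Y(n)     = x(n+d);                % one-step-ahead target
%   end
%   [Wo, xp, alpha, e] = rls_AR_pred(Xi, Y, 0, 0.98, 1e-2);
%   plot(abs(e));  title('magnitude of a posteriori prediction error');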