Radial Basis Function (RBF) networks were introduced
into the neural network literature by Broomhead and
Lowe [1], motivated by the observation of locally tuned
responses in biological neurons. Owing to their good
approximation capabilities, simple network structure, and
faster learning algorithms, RBF networks have been widely applied in many fields of science and engineering. An RBF network is a three-layer feedforward network, in which each hidden unit implements a radial activation function and each output unit computes a weighted sum of the hidden units' outputs.
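To make the architecture concrete, the following is a minimal Matlab sketch of an RBF forward pass. The Gaussian choice of basis function and the names (centers C, widths sigma, output weights W) are illustrative assumptions, not taken from [1].

% Minimal RBF forward pass (illustrative sketch; Gaussian basis assumed).
% x     : d-by-1 input vector
% C     : d-by-m matrix whose columns are the m hidden-unit centers
% sigma : 1-by-m vector of hidden-unit widths
% W     : n-by-m output weight matrix (n network outputs)
function y = rbf_forward(x, C, sigma, W)
  m = size(C, 2);
  h = zeros(m, 1);
  for j = 1:m
    r    = norm(x - C(:, j));               % radial distance to center j
    h(j) = exp(-r^2 / (2 * sigma(j)^2));    % Gaussian radial activation
  end
  y = W * h;                                % weighted sum of hidden outputs
end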
Abstract—Wireless networks in combination with image
sensors open up a multitude of previously unthinkable sensing
applications. Capable tools and testbeds for these wireless image
sensor networks can greatly accelerate development of complex,
yet efficient algorithms that meet application requirements. In this
paper, we introduce WiSNAP, a Matlab-based application
development platform intended for wireless image sensor
networks. It allows researchers and developers of such networks
to investigate, design, and evaluate algorithms and applications
using real target hardware. WiSNAP offers standardized and
easy-to-use Application Program Interfaces (APIs) to control
image sensors and wireless motes, which do not require detailed
knowledge of the target hardware. Nonetheless, its open system
architecture enables support of virtually any kind of sensor or
wireless mote. Application examples are presented to illustrate the
usage of WiSNAP as a powerful development tool.
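As a purely hypothetical illustration of the kind of workflow such a platform enables, the Matlab sketch below captures frames from an image sensor and transmits them through a mote. None of these function names are WiSNAP's actual API; they are placeholders invented for illustration only.

% Hypothetical capture-and-transmit loop in the spirit of WiSNAP.
% All function names below are invented placeholders, not the real API.
sensor = wisnap_open_sensor('camera0');   % hypothetical: attach an image sensor
mote   = wisnap_open_mote('mote0');       % hypothetical: attach a wireless mote
for k = 1:10
  img = wisnap_get_frame(sensor);         % grab one image frame
  wisnap_mote_send(mote, img(:));         % send the raw pixels via the mote
end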
% Train a two-layer neural network with the Levenberg-Marquardt
% method.
%
% If desired, it is possible to use regularization by
% weight decay. Pruned (i.e., not fully connected) networks can
% also be trained.
%
% Given a set of corresponding input-output pairs and an initial
% network,
% [W1,W2,critvec,iteration,lambda]=marq(NetDef,W1,W2,PHI,Y,trparms)
% trains the network with the Levenberg-Marquardt method.
%
% The activation functions can be either linear or tanh. The
% network architecture is defined by the matrix NetDef, which
% has two rows. The first row specifies the hidden layer and the
% second row specifies the output layer.
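A minimal call sketch follows. The 'H' (tanh) and 'L' (linear) tokens in NetDef and the trparms layout are assumptions based on common NNSYSID conventions; consult the local toolbox help for the exact format. PHI (inputs-by-N) and Y (1-by-N) are assumed to be given training data.

% Sketch of a marq call (NetDef tokens and trparms layout are assumed).
NetDef = ['HHHHH'; ...                   % hidden layer: 5 tanh units
          'L----'];                      % output layer: 1 linear unit
inputs = size(PHI, 1);
W1 = 0.1 * randn(5, inputs + 1);         % initial hidden weights (incl. bias)
W2 = 0.1 * randn(1, 5 + 1);              % initial output weights (incl. bias)
trparms = [500 0 1 0];                   % assumed: [max_iter stop lambda0 D]
[W1, W2, critvec, iteration, lambda] = marq(NetDef, W1, W2, PHI, Y, trparms);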
% This function applies the Optimal Brain Surgeon (OBS) strategy to
% prune neural network models of dynamic systems, that is, networks
% trained by NNARX, NNOE, NNARMAX1, NNARMAX2, or their recursive
% counterparts.
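The core of OBS, in Hassibi and Stork's formulation, ranks each weight q by the saliency S_q = w_q^2 / (2*[H^-1]_qq) and, after deleting that weight, adjusts the remaining ones by dw = -(w_q / [H^-1]_qq) * H^-1 * e_q. Below is a minimal Matlab sketch of one pruning step, assuming the inverse Hessian Hinv of the training criterion is already available (a sketch of the general method, not this toolbox's exact implementation).

% One OBS pruning step (sketch; assumes w and Hinv are given).
% w    : column vector of all network weights
% Hinv : inverse Hessian of the training criterion at w
S = (w.^2) ./ (2 * diag(Hinv));            % saliency of deleting each weight
[~, q] = min(S);                           % weight with the smallest saliency
e_q = zeros(size(w)); e_q(q) = 1;          % unit vector selecting weight q
w = w - (w(q) / Hinv(q, q)) * Hinv * e_q;  % adjust the remaining weights
w(q) = 0;                                  % weight q is now pruned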
% Train a two-layer neural network with a recursive prediction error
% algorithm ("recursive Gauss-Newton"). Pruned (i.e., not fully
% connected) networks can also be trained.
%
% The activation functions can be either linear or tanh. The network
% architecture is defined by the matrix NetDef, which has two
% rows. The first row specifies the hidden layer while the second
% specifies the output layer.
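For reference, the standard recursive Gauss-Newton (recursive prediction error) update with forgetting factor lam is sketched below, where theta is the weight vector, psi the gradient of the network output with respect to theta, e the current prediction error, and P the covariance-like matrix. This is a generic sketch of the method, not this toolbox's exact implementation.

% Generic recursive Gauss-Newton update at one time step (sketch).
% theta : current weight estimate      psi : d(yhat)/d(theta) at time t
% P     : covariance-like matrix       e   : prediction error y(t) - yhat(t)
% lam   : forgetting factor, 0 < lam <= 1
K     = P * psi / (lam + psi' * P * psi);   % gain vector
theta = theta + K * e;                      % parameter update
P     = (P - K * (psi' * P)) / lam;         % covariance update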