JXTA™ is a set of open, generalized peer-to-peer (P2P) protocols that allow any networked device — sensors, cell phones, PDAs, laptops, workstations, servers and supercomputers — to communicate and collaborate as peers.
Single-layer neural networks can be trained using various learning algorithms. The best-known are the Adaline, Perceptron and Backpropagation algorithms for supervised learning. The first two are specific to single-layer neural networks, while the third can be generalized to multi-layer perceptrons.
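As an illustration of the kind of supervised rule referred to above, the following is a minimal sketch of the Perceptron learning rule in Python; the function name, parameters, and the AND-function example are our own illustrative choices, not taken from the text.

import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    # X: (n_samples, n_features) inputs, y: binary targets in {0, 1}
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            update = lr * (target - pred)  # zero when the prediction is already correct
            w += update * xi
            b += update
    return w, b

# Example: learn the logical AND function with a single-layer network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)

Backpropagation replaces this local update with gradient descent on a differentiable loss, which is what allows it to be generalized to multi-layer perceptrons.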
GloptiPoly 3: moments, optimization and semidefinite programming.
GloptiPoly 3 is intended to solve, or at least approximate, the Generalized Problem of
Moments (GPM), an infinite-dimensional optimization problem which can be viewed as
an extension of the classical problem of moments [8]. From a theoretical viewpoint, the
GPM has developments and impact in various areas of mathematics such as algebra,
Fourier analysis, functional analysis, operator theory, probability and statistics, to cite
a few. In addition, and despite a rather simple and short formulation, the GPM has a
large number of important applications in various fields such as optimization, probability,
finance, control, signal processing, chemistry, crystallography, tomography, etc. For an
account of various methodologies as well as some of its potential applications, the interested
reader is referred to [1, 2] and the nice collection of papers [5].
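In a common formulation (the notation below is ours, not taken verbatim from [8]), the GPM asks for a finite Borel measure on a set $K$ that optimizes a linear criterion subject to linear moment constraints:

\min_{\mu \in \mathcal{M}(K)_+} \int_K f \, d\mu
\quad \text{subject to} \quad
\int_K h_j \, d\mu = \gamma_j, \qquad j \in J,

where $\mathcal{M}(K)_+$ denotes the finite nonnegative Borel measures supported on $K$, $f$ and the $h_j$ are given measurable functions, and the $\gamma_j$ are prescribed moments. The classical problem of moments, deciding whether a given sequence $(\gamma_j)$ is the moment sequence of some measure on $K$, is recovered as the feasibility version of this problem.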
The package includes three Matlab interfaces to the C code:
1. inference.m
An interface to the full inference package. It includes several methods for
approximate inference: Loopy Belief Propagation, Generalized Belief
Propagation, Mean-Field approximation, and four Monte Carlo sampling methods
(Metropolis, Gibbs, Wolff, Swendsen-Wang).
Use "help inference" from Matlab to see all options for usage.
2. gbp_preprocess.m and gbp.m
These two interfaces split Generalized Belief Propagation into the preprocessing
stage (gbp_preprocess.m) and the inference stage (gbp.m), so the user may run
only one of them, or change some parameters in between.
Use "help gbp_preprocess" and "help gbp" from Matlab.
3. simulatedAnnealing.m
An interface to the simulated-annealing C code. This code uses the Metropolis
sampling method, the same one used for inference (a generic sketch of Metropolis
sampling appears after this list).
Use "help simulatedAnnealing" from Matlab.
The toolbox solves a variety of approximate modeling problems for linear static models. The model can be parameterized in kernel, image, or input/output form, and the approximation criterion, called the misfit, is a weighted norm of the difference between the given data and data that is consistent with the model. There are three main classes of functions in the toolbox: transformation functions, misfit computation functions, and approximation functions. The approximation functions derive an approximate model from data, the misfit computation functions are used for validation and comparison of models, and the transformation functions are used for deriving one model representation from another.
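In common notation (ours, not necessarily the toolbox's), for a data matrix $D$ and a candidate model $\mathcal{B}$ the misfit can be written as

\operatorname{misfit}(D, \mathcal{B}) \;=\; \min_{\widehat{D} \ \text{consistent with} \ \mathcal{B}} \; \| D - \widehat{D} \|_{W},

where $\| \cdot \|_{W}$ is a weighted norm and the minimizer $\widehat{D}$ is the data closest to $D$ that the model $\mathcal{B}$ can reproduce exactly; the approximation functions then minimize this misfit over a chosen model class.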
KEYWORDS: Total least squares, generalized total least squares, software implementation.