enhanced by combining it with some other powerful algorithms, e.g., to
produce an algorithm for parallel computation[6]. ASA is now used
world-wide across many disciplines[7,8,9,10], including specific
disciplines such as finance[11,12,13,14,15,16],
neuroscience[17,18,19,20], and combat analyses[21,22,23,24,25]. Some
papers illustrate the combined use of ASA for optimization and
sampling[26]. The http://www.ingber.com/asa_papers.html file in the
ASA archive contains references to some patents and papers using ASA
and VFSR.
5.2. Outline of ASA Algorithm
Details of the ASA algorithm are best obtained from the published
papers. There are three parts to its basic structure.
5.2.1. Generating Probability Density Function
In a D-dimensional parameter space with parameters p^i having
ranges [A_i, B_i], about the k'th last saved point (e.g., a local
optimum), p_k^i, a new point is generated using a distribution defined
by the product of distributions for each parameter, g^i(y^i; T_i), in
terms of random variables y^i in [-1, 1], where p_k+1^i = p_k^i +
y^i(B_i - A_i), and "temperatures" T_i,
g^i(y^i; T_i) = 1/[2(|y^i| + T_i) ln(1 + 1/T_i)].
The DEFINE_OPTIONS USER_GENERATING_FUNCTION permits using an
alternative to this ASA distribution function.
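For concreteness, the following minimal C sketch draws a trial value
for one parameter from this generating distribution, using the
standard inverse-CDF form of g^i(y^i; T_i); the function names are
illustrative, and asa.c itself handles ranges, integer parameters, and
its own random generator differently.

#include <math.h>
#include <stdlib.h>

/* Draw y in [-1, 1] from g(y; T) = 1/[2(|y| + T) ln(1 + 1/T)]
   via its inverse cumulative distribution (a standard ASA form). */
static double asa_generate_y (double T)
{
  double u = (double) rand () / ((double) RAND_MAX + 1.0); /* u in [0, 1) */
  double sgn = (u >= 0.5) ? 1.0 : -1.0;

  return sgn * T * (pow (1.0 + 1.0 / T, fabs (2.0 * u - 1.0)) - 1.0);
}

/* Propose a new value for one parameter with range [A, B] about p_old;
   here out-of-range points are simply regenerated. */
static double propose_parameter (double p_old, double A, double B, double T)
{
  double p_new;

  do {
    p_new = p_old + asa_generate_y (T) * (B - A);
  } while (p_new < A || p_new > B);

  return p_new;
}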
5.2.2. Acceptance Probability Density Function
The cost functions, C(p_k+1) and C(p_k), are compared using a
uniform random generator, U in [0, 1), in a "Boltzmann" test: If
exp[-(C(p_k+1) - C(p_k))/T_cost] > U,
where T_cost is the "temperature" used for this test, then the new
point is accepted as the new saved point for the next iteration.
Otherwise, the last saved point is retained. The DEFINE_OPTIONS
USER_ACCEPT_ASYMP_EXP or USER_ACCEPT_THRESHOLD permit using
alternatives to this Boltzmann distribution function.
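As a minimal C sketch of this default acceptance test (the function
name is illustrative, not the one used in asa.c):

#include <math.h>
#include <stdlib.h>

/* Default "Boltzmann" acceptance test: accept the new point if
   exp[-(C_new - C_old)/T_cost] > U, with U uniform in [0, 1).
   Downhill moves (C_new < C_old) are therefore always accepted. */
static int accept_new_point (double C_old, double C_new, double T_cost)
{
  double U = (double) rand () / ((double) RAND_MAX + 1.0);

  return exp (-(C_new - C_old) / T_cost) > U;
}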
5.2.3. Reannealing Temperature Schedule
The annealing schedule for each parameter temperature, T_i, from
a starting temperature T_0i, is
T_i(k_i) = T_0i exp(-c_i k_i^(1/D)).
This is discussed further below.
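In C this schedule is simply the following (symbols as in the formula
above; the bookkeeping of k_i is handled inside asa.c):

#include <math.h>

/* T_i(k_i) = T_0i exp(-c_i k_i^(1/D)), where k_i counts the points
   generated for parameter i and D is the number of parameters. */
static double parameter_temperature (double T_0i, double c_i,
                                     long k_i, int D)
{
  return T_0i * exp (-c_i * pow ((double) k_i, 1.0 / (double) D));
}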
The annealing schedule for the cost temperature is developed
similarly to the parameter temperatures. However, the index for
reannealing the cost function, k_cost, is determined by the number of
accepted points, instead of the number of generated points as used for
the parameters. This choice was made because the Boltzmann acceptance
criterion uses an exponential distribution, which is not as fat-tailed
as the ASA distribution used for the parameters. This schedule can be
modified using several OPTIONS. In particular, the Pre-Compile
DEFINE_OPTIONS USER_COST_SCHEDULE permits quite arbitrary functional
modifications of this annealing schedule, and the Pre-Compile
DEFINE_OPTIONS USER_ACCEPT_ASYMP_EXP or USER_ACCEPT_THRESHOLD, noted
above, permit alternatives to the Boltzmann acceptance test.
As determined by the Program Options selected, the parameter
"temperatures" may be periodically adaptively reannealed, or increased
relative to their previous values, using their relative first
derivatives with respect to the cost function, to guide the search
"fairly" among the parameters.
As determined by the Program Options selected, the reannealing of
the cost temperature resets the scale of the annealing of the cost
acceptance criterion as
T_cost(k_cost) = T_0cost exp(-c_cost k_cost^(1/D)).
The new T_0cost is taken to be the minimum of the current initial cost
temperature and the maximum of the absolute values of the best and
last cost functions and their difference. The new k_cost is
calculated taking T_cost as the maximum of the current value and the
absolute value of the difference between the last and best saved
minima of the cost function, constrained not to exceed the current
initial cost temperature. This procedure essentially resets the scale
of the annealing of the cost temperature within the scale of the
current best or last saved minimum.
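Written out in C, the reset just described looks roughly like the
following sketch (the helper names are hypothetical, and asa.c adds
guards for small or degenerate values):

#include <math.h>

static double dmax (double a, double b) { return a > b ? a : b; }
static double dmin (double a, double b) { return a < b ? a : b; }

/* Reset the cost-temperature scale from the best and last saved cost
   values, then invert T_cost(k_cost) = T_0cost exp(-c_cost k_cost^(1/D))
   to obtain the new annealing index k_cost. */
static double reanneal_cost (double C_best, double C_last, double c_cost,
                             int D, double *T_0cost, double *T_cost)
{
  double spread = dmax (fabs (C_best),
                        dmax (fabs (C_last), fabs (C_best - C_last)));

  *T_0cost = dmin (*T_0cost, spread);     /* new initial cost temperature */

  *T_cost = dmax (*T_cost, fabs (C_last - C_best));
  *T_cost = dmin (*T_cost, *T_0cost);     /* not to exceed T_0cost */

  return pow (log (*T_0cost / *T_cost) / c_cost, (double) D);
}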
This default algorithm for reannealing the cost temperature takes
advantage of the ASA importance sampling that relates most
specifically to the parameter temperatures. While it often is quite
efficient for some systems, it may cause the search to dwell too long
in local minima for other systems. In such cases, the user may
experiment with alternative algorithms effected using the
Reanneal_Cost OPTIONS, discussed below. For example, when
Reanneal_Cost < -1 or > 1, ASA provides an alternative calculation
that periodically recalculates the initial and current cost
temperatures, or just the initial cost temperature, respectively, as a
deviation over a sample of cost functions.
These reannealing algorithms can be changed adaptively by the
user as described below in the sections USER_REANNEAL_COST and
USER_REANNEAL_PARAMETERS.
5.3. Efficiency Versus Necessity
ASA is not necessarily an "efficient" code. For example, if you
know that your cost function to be optimized is something close to a
parabola, then a simple gradient Newton search method most likely
would be faster than ASA. ASA is believed to be faster and more
robust than other simulated annealing techniques for most complex
problems with multiple local optima; again, be careful to note that
some problems are best treated by other algorithms. If you do not
know much about the structure of your system, and especially if it has
complex constraints, and you need to search for a global optimum, then
this ASA code is heartily recommended to you.
In the context of efficiency and necessity, the user should be
alert to recognize that any sampling or optimization program generally
should be considered complementary to, not a substitute for, gaining
knowledge of a particular system. Unlike relatively "canned" codes
that exist for (quasi-)linear systems, nonlinear systems typically are
non-typical. Often some homework must be done to understand the
system, and tuning often is required of numerical algorithms such as
ASA. For example, while principal component analysis (PCA) often
suffices to generate good (quasi-)orthogonal or (quasi-)independent
sets of parameters, this is not true for general nonlinear systems.
While such innovations as reannealing take good advantage of ASA's
independent distributions for each parameter, they generally may not
be a good substitute for a user-defined front-end, e.g., before the
call to asa () or even embedded within the cost_function (), to
interpret and define relevant parameters.
The ASA-NOTES file contains the sections @@Number of Generated
States Required and @@Judging Importance-Sampling, recommending use of
log-log plots to extrapolate the number of generated states required
to attain a global minimum, possibly as a function of selected
OPTIONS.
6. Outline of Use
Set up the ASA interface: Your program should be divided into two
basic modules. (1) The user calling procedure, containing the cost
function to be minimized (or its negative if you require a global
maximum), is contained in asa_usr.c, asa_usr.h and asa_usr_cst.c. (2)
The ASA optimization procedure is contained in asa.c and asa.h. The
file asa_usr_asa.h contains definitions and macros common to both
asa.h and asa_usr.h. Furthermore, there are some options to
explore/read below. It is assumed there will be no confusion over the
standard uses of the term "parameter" in different contexts, e.g., as
an element passed by a subroutine or as a physical coefficient in a
cost function.
ASA has been run successfully on many machines under many
compilers. To check out your own system, you can run `make` (or the
equivalent set of commands in the ASA-Makefile), and compare your
asa_out and asa_usr_out files to the asa_test_asa and asa_test_usr
files, respectively, provided with this code. No attempt was made to
optimize any compiler, so that the test runs do not really signify any
testing of compilers or architectures; rather they are meant to be
used as a guide to determine what you might expect on your own
machine.
The major sections below describe the compilation procedures, the
Program Options available to you to control the code, the use of
templates to set up your user module and interface to the asa module,
and how to submit bug reports.
If you already have your own cost function defined, you can
insert it into asa_usr_cst.c. If you wish to insert more OPTIONS, as
a quick guide to get started, you can search through asa_usr.c and the
ASA-Makefile for all occurrences of "MY_TEMPLATE_" to insert the
definitions required to run ASA. If both OPTIONS_FILE and
OPTIONS_FILE_DATA are set to TRUE, then usually most such
information can be placed in the asa_opt file, and then only the
cost_function () must be inserted. The place to insert the
cost_function () is marked by "MY_TEMPLATE_cost."
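As an illustration of what goes into the MY_TEMPLATE_cost slot, the
following sketch is a trivial sum-of-squares cost function written
against the prototype used by the asa_usr_cst.c template; check
asa_usr.h in your release for the exact prototype and types, since
these can differ between versions.

#include "asa_usr.h"

/* Minimal user cost function: a sum of squares over all parameters.
   *cost_flag must be set TRUE to tell ASA the returned value is valid
   (e.g., no constraint was violated for this set of parameters). */
double cost_function (double *x,
                      double *parameter_lower_bound,
                      double *parameter_upper_bound,
                      double *cost_tangents,
                      double *cost_curvature,
                      ALLOC_INT *parameter_dimension,
                      int *parameter_int_real,
                      int *cost_flag,
                      int *exit_code,
                      USER_DEFINES *USER_OPTIONS)
{
  ALLOC_INT i;
  double sum = 0.0;

  for (i = 0; i < *parameter_dimension; ++i)
    sum += x[i] * x[i];

  *cost_flag = TRUE;

  return sum;
}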
7. ASA-Makefile/Compilation Procedures
The ASA-Makefile is intended to be a template for your own
Makefile. For quick use, just copy this file to Makefile, which will
be recognized by any standard make tool.
The PostScript(R) ASA-README.ps and ASCII ASA-README.txt and
ASA-README+.txt files were generated using `make doc`. The
ASA-Makefile describes some options for formatting these files
differently. Use `make` or `make all` to compile and run asa_run, the
executable prepared for the test function. Examine the ASA-Makefile
to determine the "clean" options available.
Since complex problems by their nature are often quite unique, it
is unlikely that the default parameters are just right for your
problem. However, experience has shown that if you do not have any a
priori reason to determine your own parameters, then you might do
just fine using these defaults, and these are recommended as a
first-order guess. These defaults can be changed simply by adding to
the DEFINE_OPTIONS line in the ASA-Makefile, by passing options on
your command line, and by changing structure elements in the user or
asa module as described below. Depending on how you integrate ASA
into your own user modules, you may wish to modify this ASA-Makefile
or at least use some of these options in your own compilation
procedures.
Note that the ASA-Makefile is just a convenience, not a
necessity, for using ASA. For example, on systems that do not support
this utility, you may simply compile the files following the
guidelines in
the ASA-Makefile, taking care to pass the correct DEFINE_OPTIONS to
your compilation commands at your shell prompt. Still another way,
albeit not as convenient, is to make the desired changes in the
asa_usr_asa.h, and asa.h or asa_usr.h files as required.
Since the ASA-Makefile contains comments giving short
descriptions of some options, it should be considered as an extension
of this documentation file. For convenience, most of this information
is repeated below. However, to see how these options can be used in
compilations, please read through the ASA-Makefile.
For example, to run the ASA test problem using the gcc compiler,
you could just type at your "%" prompt:
% cp ASA-Makefile Makefile
% gcc -g -DASA_TEST=TRUE -o asa_run asa_usr.c asa_usr_cst.c asa.c -lm
% asa_run
If you have defined your own cost function in asa_usr_cst.c or
within the "MY_TEMPLATE_" guides in asa_usr.c, then ASA_TEST should be
set to FALSE (the default if ASA_TEST is not defined in your
compilation lines or in the ASA-Makefile). The code for ASA_TEST=TRUE
is given just above these guides as a template to use for your own
cost function.
The easiest way for many users to quickly use ASA likely is to
invoke the COST_FILE, OPTIONS_FILE, and OPTIONS_FILE_DATA OPTIONS (the
default), using the files asa_usr_cst.c and asa_opt as templates.
This is further described below and illustrated in the
http://www.ingber.com/asa_examples.txt file in the section Use of
COST_FILE on Shubert Problem.
7.1. DLL ASA-Makefile
Under Cygwin (cygwin.com), set ASA_LIB to TRUE and INCL_STDOUT to
FALSE (OPTIONS described below), with the command
% make asadll
to produce a DLL that permits calling asa_main() as a DLL function
under Windows.