This is the mail archive of the
gsl-discuss@sources.redhat.com
mailing list for the GSL project.
RE: High-dimensional Minimization without analytical derivatives
- From: "Anatoliy Belaygorod" <belaygorod at wustl dot edu>
- To: "Joakim Hove" <hove at ift dot uib dot no>
- Cc: <gsl-discuss at sources dot redhat dot com>
- Date: Sat, 4 Sep 2004 09:48:01 -0500
- Subject: RE: High-dimensional Minimization without analytical derivatives
> as far as I am aware the simplex method is restricted to *linear*
> problems
Well, that's what I thought too at first, but in the example in the
"Multidimensional Minimization" chapter of the GSL manual the target
function is quadratic in the variables being minimized. I have also
heard from people who used the _nmsimplex method successfully on other
highly nonlinear problems. (The simplex method for *linear programming*
is a different algorithm from the Nelder-Mead simplex method, which is
what gsl_multimin_fminimizer_nmsimplex implements; Nelder-Mead works on
general nonlinear functions without derivatives.)
Oops, it looks like JDL already answered my question a minute ago.
JDL is right: the main thing I cared about was whether or not
simulated annealing is robust enough to jump from one 'valley' to the
next in the search for the GLOBAL minimum, unlike gradient-based
methods, which are more likely to get stuck in a local minimum.
Thank you.
Anatoliy