This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Re: High-dimensional Minimization without analytical derivatives


> > the latter are better tuned for 'zeroing in' on the local maxima.  In
> > that regard, is the simplex method closer to SA or to gradient-based
> > methods?


> Well, excuse me if I am completely off base, but as far as I am aware
> the simplex method is restricted to *linear* problems - where it is
> 'guaranteed' to find the optimal solution. Gradient-based methods and

I think you are off base here, though understandably so. There are two optimisation techniques known as the 'simplex method'. I think you have in mind the better-known one, used for linear programming. The one used to find minima of nonlinear functions is due to Nelder and Mead; it is also called the 'simplex method' because it modifies a simplex (the generalisation of a triangle to many dimensions) as it searches for a solution.
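In GSL this is the gsl_multimin_fminimizer_nmsimplex minimiser. Here is a minimal sketch of how it is driven; the two-dimensional quadratic objective, starting point, step sizes and tolerance are only illustrative, not anyone's actual problem.

/* Minimal sketch of GSL's Nelder-Mead minimiser; the quadratic
   objective and starting values are illustrative only. */
#include <stdio.h>
#include <gsl/gsl_multimin.h>

static double my_f (const gsl_vector *v, void *params)
{
  double x = gsl_vector_get (v, 0);
  double y = gsl_vector_get (v, 1);
  (void) params;                       /* no parameters in this example */
  return (x - 1.0) * (x - 1.0) + (y + 2.0) * (y + 2.0);
}

int main (void)
{
  gsl_multimin_function func = { &my_f, 2, NULL };
  gsl_vector *x = gsl_vector_alloc (2);
  gsl_vector *step = gsl_vector_alloc (2);
  gsl_multimin_fminimizer *s =
    gsl_multimin_fminimizer_alloc (gsl_multimin_fminimizer_nmsimplex, 2);
  int status, iter = 0;

  gsl_vector_set (x, 0, 5.0);
  gsl_vector_set (x, 1, 5.0);
  gsl_vector_set_all (step, 1.0);      /* initial simplex step sizes */
  gsl_multimin_fminimizer_set (s, &func, x, step);

  do
    {
      iter++;
      status = gsl_multimin_fminimizer_iterate (s);
      if (status)
        break;
      /* stop when the simplex has shrunk below the tolerance */
      status = gsl_multimin_test_size (gsl_multimin_fminimizer_size (s), 1e-4);
    }
  while (status == GSL_CONTINUE && iter < 200);

  printf ("minimum %g at (%g, %g)\n",
          gsl_multimin_fminimizer_minimum (s),
          gsl_vector_get (gsl_multimin_fminimizer_x (s), 0),
          gsl_vector_get (gsl_multimin_fminimizer_x (s), 1));

  gsl_multimin_fminimizer_free (s);
  gsl_vector_free (x);
  gsl_vector_free (step);
  return 0;
}

The size test is the usual stopping criterion for the simplex minimisers: gsl_multimin_fminimizer_size returns the average distance of the simplex vertices from their centroid, so no gradient information is needed anywhere.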


I think of Nelder-Mead as closer to simulated annealing inasmuch as it won't necessarily move to the nearest local minimum and doesn't use gradients, analytic or estimated.

Calling simulated annealing SA is also potentially problematic, because there is a stochastic approximation algorithm due to Kiefer and Wolfowitz that is commonly known as SA. It can be used on nonstochastic problems by adding noise and has guaranteed convergence to a global optimum, although the convergence is often too slow for SA to be practical.
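For what it's worth, the core of Kiefer-Wolfowitz is just a descent step driven by a central finite difference with shrinking gain and difference sequences. The sketch below shows the basic deterministic one-dimensional update (not the added-noise, global-optimisation variant mentioned above, and not a GSL routine); the objective, gain constants a_n = a/n and c_n = c/n^(1/3), and iteration count are one admissible choice, picked purely for illustration.

/* Sketch of a Kiefer-Wolfowitz style stochastic-approximation step. */
#include <stdio.h>
#include <math.h>

static double f (double x)             /* example objective: minimum at x = 2 */
{
  return (x - 2.0) * (x - 2.0);
}

int main (void)
{
  double x = 10.0;                     /* starting point */
  double a = 1.0, c = 1.0;             /* gain constants */
  int n;

  for (n = 1; n <= 1000; n++)
    {
      double a_n = a / n;
      double c_n = c / cbrt ((double) n);
      /* central finite-difference estimate of the gradient */
      double g = (f (x + c_n) - f (x - c_n)) / (2.0 * c_n);
      x -= a_n * g;                    /* descent step */
    }

  printf ("approximate minimiser: %g\n", x);
  return 0;
}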

--
JDL

