This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Re: Parameter vectors declared as const in minimized functions


Hi!

> >Indeed; however, NMS is a derivative-free method, so the gradients are
> >not problematic. The problem is that sometimes the model may yield
> >unphysical results simply because the fit to observations is better.
> >That's why it would be useful to at least partially constrain it.
> >Namely, I'm modeling binary stars; the model may sometimes be better
> >off joining both stars into one although that clearly cannot be the
> >case. That's why I'd like to constrain the distance between both stars
> >to be larger than, say, the sum of both stellar radii.
> >
> You are right about its derivative-free character. I don't use this
> method, so I was not careful about that point in my answer. I
> nevertheless wonder how good this method is, since it is not so popular
> in nonlinear programming. Your remark about the physical meaning makes
> sense, as it does in other applications. But unconstrained solvers are
> easier to construct, and therefore more popular.

NMS gives the impression that it is not popular because it relies only on
function evaluations, so it is substantially slower than other
minimization algorithms. However, for highly non-linear systems where the
derivative isn't known or doesn't behave nicely, NMS is very robust and
very successful. From what you are saying, it seems to me that NMS is one
of the most under-appreciated methods for non-linear multi-dimensional
minimization.
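
For reference, this is roughly how I drive the NMS minimizer in GSL; the
cost function chi2_f, the dimension and the step sizes below are only
placeholders, not my real binary-star model:

#include <stdio.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_multimin.h>

/* placeholder cost function; note the const gsl_vector * signature */
double chi2_f (const gsl_vector *v, void *params)
{
  double a = gsl_vector_get (v, 0);
  double b = gsl_vector_get (v, 1);
  (void) params;
  return (a - 1.0) * (a - 1.0) + (b - 2.0) * (b - 2.0);
}

int main (void)
{
  size_t n = 2, iter = 0;
  int status;

  gsl_multimin_function f;
  f.n = n;
  f.f = &chi2_f;
  f.params = NULL;

  /* starting point and initial simplex step sizes */
  gsl_vector *x = gsl_vector_alloc (n);
  gsl_vector_set_all (x, 0.0);
  gsl_vector *step = gsl_vector_alloc (n);
  gsl_vector_set_all (step, 0.1);

  gsl_multimin_fminimizer *s =
    gsl_multimin_fminimizer_alloc (gsl_multimin_fminimizer_nmsimplex, n);
  gsl_multimin_fminimizer_set (s, &f, x, step);

  do
    {
      iter++;
      status = gsl_multimin_fminimizer_iterate (s);
      if (status)
        break;
      /* stop when the simplex has shrunk below the size tolerance */
      status = gsl_multimin_test_size (gsl_multimin_fminimizer_size (s), 1e-4);
    }
  while (status == GSL_CONTINUE && iter < 1000);

  printf ("minimum near (%g, %g), chi2 = %g\n",
          gsl_vector_get (gsl_multimin_fminimizer_x (s), 0),
          gsl_vector_get (gsl_multimin_fminimizer_x (s), 1),
          gsl_multimin_fminimizer_minimum (s));

  gsl_multimin_fminimizer_free (s);
  gsl_vector_free (x);
  gsl_vector_free (step);
  return 0;
}

The const gsl_vector * in the signature of chi2_f is the parameter vector
that the subject of this thread refers to; GSL hands it to the cost
function read-only.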

> >>You should check whether there is a bounds-constrained solver, or try a
> >>barrier approach to force the iterates to be feasible at each step of
> >>the optimization.
> >
> >That's another problem; you see, I practically never have a fully
> >constrained simplex, so I can't deploy constrained minimization of any
> >kind. NMS is pretty promising for my needs; I only need to restrict the
> >minimization in several dimensions to get physical results. But rest
> >assured, I agree with you when you say one should be extra careful when
> >doing it this way.
> 
> There are different ways to deal with bound constraints, and several
> solvers can be freely used. See for instance
> http://www-neos.mcs.anl.gov/neos/server-solver-types.html.

Thanks, I thought NEOS was only a server and didn't actually offer any
code for download; I'll take a look at it shortly! :)

> If you still want to use gsl, a simple way to use an unconstrained
> approach while imposing some constraints is to add a log-barrier term:
> -mu*log(x - min) or -mu*log(max - x)
> By using a mu that is not too large (it can be the same for each
> constraint) and a feasible starting point, you ensure that the solution
> is feasible and close to the true constrained optimizer. You can even
> refine the solution by restarting the optimization with a smaller mu.

That's pretty much what I'm doing right now - within the cost (chi2)
function, I check whether a parameter value is outside its allowed
interval and, if it is, I multiply chi2 by 100. That way I artificially
impose a barrier on the solution, so it is even cruder than the method
you suggested above.
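
Concretely, my cost function looks roughly like the sketch below; the
parameter index, the bound d_min, the weight mu and base_chi2() are only
placeholders for the real binary-star model, and the barrier variant
writes your suggestion as -mu*log(d - d_min) so the cost grows without
bound at the limit:

#include <math.h>
#include <gsl/gsl_math.h>
#include <gsl/gsl_vector.h>

/* stand-in for the real binary-star chi^2; here just a smooth bowl */
static double base_chi2 (const gsl_vector *v)
{
  double d = gsl_vector_get (v, 0);
  return (d - 2.0) * (d - 2.0);
}

static const double d_min = 1.0;   /* hypothetical lower bound on the separation */
static const double mu    = 1e-2;  /* barrier weight; shrink it and restart to refine */

/* crude penalty: inflate chi^2 whenever the parameter leaves the allowed interval */
double chi2_penalty (const gsl_vector *v, void *params)
{
  double d = gsl_vector_get (v, 0);
  double chi2 = base_chi2 (v);
  (void) params;
  return (d <= d_min) ? 100.0 * chi2 : chi2;
}

/* log-barrier: -mu*log(d - d_min) grows without bound as d approaches d_min
   from above, so a feasible starting simplex tends to stay feasible */
double chi2_barrier (const gsl_vector *v, void *params)
{
  double d = gsl_vector_get (v, 0);
  (void) params;
  if (d <= d_min)
    return GSL_POSINF;             /* barrier undefined here; reject the point */
  return base_chi2 (v) - mu * log (d - d_min);
}

Either of these can be plugged in as the f member of the
gsl_multimin_function shown earlier; the barrier version can then be
re-run with a smaller mu, as you suggest.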

> Note that the approach is very crude and has been refined in recent
> years (especially with interior-point methods), but perhaps it could
> help you.

I'll look into it shortly. Thanks again for all the effort and
feedback! :)

Andrej

