This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.
Re: multifit/Levenberg-Marquardt
- From: Sanjay Bhatnagar <sbhatnag at aoc dot nrao dot edu>
- To: Brian Gough <bjg at network-theory dot co dot uk>
- Cc: sbhatnag at zia dot aoc dot nrao dot edu, gsl-discuss at sources dot redhat dot com
- Date: Wed, 24 Sep 2003 08:40:07 -0600
- Subject: Re: multifit/Levenberg-Marquardt
- References: <16235.6607.74025.885210@gargle.gargle.HOWL> <16241.35055.297862.273899@debian.local>
- Reply-to: sbhatnag at aoc dot nrao dot edu
Brian Gough writes:
> Sanjay Bhatnagar writes:
> > I have been using GSL for my work on image deconvolution. I need to
> > use the Levenberg-Marquardt algorithm for Non-linear minimization.
> > However the problem I am solving involves large data size as well as a
> > large no. of parameters.
>
> For image deconvolution I would recommend using a specialised
> algorithm -- the least-squares fitting routines in GSL are intended
> for ordinary data, rather than images.
>
That's not the point. The point I was trying to make is that the GSL
implementation is inefficient (from the memory-usage point of view) for
large data sets. GSL needs memory of size NxP, where P is the number of
parameters and N is the data size, because it keeps the full Jacobian in
memory. In contrast, the Numerical Recipes implementation manages the
memory requirements better (it needs memory for roughly N+P
floats/doubles), since it accumulates the normal equations one data
point at a time instead of storing the whole Jacobian. So unless there
is some technical advantage in doing it the way it's done in GSL, it
would be better to improve it on that front. And if there IS a
technical reason for doing it this way, it may be useful to have a
separate memory-efficient implementation ALSO in place.
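To make the idea concrete, here is a rough sketch (in C) of the kind of
accumulation I have in mind. This is NOT the GSL API; the callback
types and the function name below are made up for illustration. The
point is that only the PxP normal matrix A = J^T J, the P-vector
g = J^T r, and a single row of the Jacobian ever need to be in memory,
whatever the size of N:

#include <stdlib.h>

/* Hypothetical callbacks supplied by the user: the model value f_i
   and its gradient df_i/dp at data point i for the current
   parameters p.  (Made-up types for this sketch.) */
typedef double (*model_f)(size_t i, const double *p);
typedef void   (*model_df)(size_t i, const double *p, double *grad);

/* Accumulate A = J^T J (PxP, row-major) and g = J^T r (length P)
   without ever storing the NxP Jacobian: only one row of J
   ("grad") is held at a time. */
int normal_equations(size_t N, size_t P,
                     const double *y, const double *p,
                     model_f f, model_df df,
                     double *A, double *g)
{
  size_t i, j, k;
  double *grad = malloc(P * sizeof(double)); /* scratch: one Jacobian row */
  if (!grad) return -1;

  for (j = 0; j < P * P; j++) A[j] = 0.0;
  for (j = 0; j < P; j++)     g[j] = 0.0;

  for (i = 0; i < N; i++) {
    double r = y[i] - f(i, p);  /* residual at point i */
    df(i, p, grad);             /* row i of the Jacobian */
    for (j = 0; j < P; j++) {
      g[j] += grad[j] * r;      /* g = J^T r */
      for (k = 0; k < P; k++)
        A[j * P + k] += grad[j] * grad[k];  /* A = J^T J */
    }
  }

  free(grad);
  return 0;
}

Each Levenberg-Marquardt iteration then only needs a PxP linear solve
of (A + lambda*diag(A)) delta = g, which as far as I can tell is
essentially what the Numerical Recipes routine does internally.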
Just to reiterate - the problem is not that I am using images. I can
give examples where "ordinary data" is also large (and I wonder what
the difference between images and "ordinary data" is, anyway?).
Regards,
sanjay