This is the mail archive of the gsl-discuss@sourceware.org mailing list for the GSL project.



Re: question on GSL development


James Bergstra <james.bergstra@umontreal.ca> writes:

> I am working on an extension to facilitate building and training
> neural networks (among other things).
> 
> I have some code in CVS at Savannah, under the project name "Montreal
> Scientific Library", for designing neural networks.
> 
> Your comments on my approach would be greatly appreciated!

Thanks, James. I've downloaded the code.

> I was thinking about an approach for coding genetic algorithms, and I
> concluded (IMHO!) that the cleanest way to provide generic tools for
> solving GA problems and other problems in combinatorial optimization
> would be to establish a framework for optimizing a function on a
> *tensor*, the way the gsl_multimin_* routines optimize a function on a
> vector space.
> 
> Gibbs sampling would be one algorithm for this, a GA with given
> recombination policies would be another, dynamic programming another, 
> and gradient-descent algorithms could be used too, when the values of
> the tensor elements are highly correlated in neighbourhoods.
> 
> Maybe you or someone else would like to comment on these ideas,
> especially if you have some background in combinatorial optimization :)
> 
> James
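
For concreteness, here is a minimal sketch of what such a
tensor-objective interface could look like if it were modeled on GSL's
gsl_multimin_function. The msl_* names are invented for illustration;
they are not code from GSL or from the Montreal Scientific Library:

    #include <stddef.h>

    /* A discrete "tensor": every element takes a value in
       0 .. nstates-1, and the objective is evaluated on the
       whole assignment. */
    typedef struct {
        size_t rank;     /* number of indices */
        size_t *dims;    /* extent of each index */
        size_t n;        /* total number of elements (product of dims) */
        size_t nstates;  /* size of each element's value alphabet */
        int *data;       /* current assignment, flattened */
    } msl_tensor;

    /* Objective on the tensor, in the style of gsl_multimin_function:
       a function pointer plus opaque user parameters. */
    typedef struct {
        double (*f)(const msl_tensor *x, void *params);
        void *params;
    } msl_tensor_function;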
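
And one possible driver for that interface, corresponding to the Gibbs
sampling James mentions: a sweep that revisits each element and
resamples it from the Boltzmann weights of its candidate values,
holding the rest of the tensor fixed. gsl_rng_uniform is real GSL; the
msl_* names remain hypothetical:

    #include <math.h>
    #include <stdlib.h>
    #include <gsl/gsl_rng.h>

    /* One Gibbs-style sweep over all elements of x at the given
       temperature: each element is resampled in proportion to
       exp(-f/T) with every other element held fixed. */
    static void msl_gibbs_sweep(msl_tensor *x,
                                const msl_tensor_function *F,
                                double temperature, gsl_rng *r)
    {
        double *w = malloc(x->nstates * sizeof *w);
        size_t i, s;

        if (w == NULL) return;

        for (i = 0; i < x->n; i++) {
            double z = 0.0, u;

            /* score every candidate value of element i */
            for (s = 0; s < x->nstates; s++) {
                x->data[i] = (int) s;
                w[s] = exp(-F->f(x, F->params) / temperature);
                z += w[s];
            }

            /* draw the new value with probability w[s] / z */
            u = z * gsl_rng_uniform(r);
            for (s = 0; s + 1 < x->nstates && u >= w[s]; s++)
                u -= w[s];
            x->data[i] = (int) s;
        }
        free(w);
    }

As written this is naive: it evaluates f from scratch n * nstates times
per sweep. A real implementation would exploit the structure of f to
recompute only the local change in the objective, which is exactly the
neighbourhood correlation James points to for the gradient-descent case.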

Can you give some more detail on this approach? I've worked mainly on
steady-state GAs, applied to combinatorial problems in computational
chemistry.

Paco

