Welcome to the HNN development efforts
jrm8005 at gmail.com
Mon Feb 1 21:15:28 EST 2010
On Mon, Feb 1, 2010 at 7:46 PM, Thomas Bereknyei <tomberek at gmail.com> wrote:
>> On the scalability front: anything that can be done to facilitate
>> parallelization and distribution of computations might be important.
>> Eventually, people will be asking for bells and whistles like Hopfield
>> neural networks and PCNNs.
> That's why I think we should build a framework with 'defaults' that
> can be utilized and overridden. That way there is MUCH less coding
> someone would have to do to implement a new type of NN. That's why I
> was thinking that creating classes of nets and neurons/nodes would be
> the way to go.
> On parallelization: again, with set defaults, someone could simply add
> a 'par', 'pseq', and 'seq' somewhere in an evaluation process. We
> can provide default parallel code as well.
> My outlook on this whole project is to allow experimentation with
> models, instead of providing a single set model to work with.
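The par/pseq defaults above could be sketched roughly like this. This is only an illustration under assumed names (`neuronOutput`, `evalLayerPar` are made up, not HNN code); it shows where a `par`/`pseq` pair would slot into a layer evaluation:

```haskell
import Control.Parallel (par, pseq)

-- Hypothetical weighted-sum output of a single neuron.
neuronOutput :: [Double] -> [Double] -> Double
neuronOutput ws xs = sum (zipWith (*) ws xs)

-- Evaluate a layer by sparking one half of the neurons while the
-- main thread works on the other half. Note that 'par' here only
-- sparks the list to WHNF, so this is a sketch of the shape of the
-- code, not a tuned parallel implementation.
evalLayerPar :: [[Double]] -> [Double] -> [Double]
evalLayerPar weights inputs =
  let (front, back) = splitAt (length weights `div` 2) weights
      ys = map (`neuronOutput` inputs) front
      zs = map (`neuronOutput` inputs) back
  in zs `par` (ys `pseq` (ys ++ zs))
```

A default like this could live in the framework, with users free to override it with their own evaluation strategy.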
Agreed. In fact, it might behoove us to think even more radically:
perhaps most of HNN should be typeclasses. While this may not be a
workable idea, thinking along such lines may help us come up with good
designs.
The GA code I wrote seems to be heading in that direction. 90 percent
of the code is just functions that could easily be rewritten to use
typeclass methods instead.
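The "classes of nets and neurons" direction might look something like the sketch below. All names here (`Neuron`, `fire`, `Perceptron`, `evalLayer`) are hypothetical, invented for illustration rather than taken from HNN:

```haskell
-- A neuron is anything that can map an input vector to an output.
class Neuron n where
  fire :: n -> [Double] -> Double

-- One concrete instance: a threshold perceptron.
data Perceptron = Perceptron { weights :: [Double], bias :: Double }

instance Neuron Perceptron where
  fire (Perceptron ws b) xs = step (b + sum (zipWith (*) ws xs))
    where step v = if v >= 0 then 1 else 0

-- A default layer evaluation written only against the class, so any
-- new neuron type gets it for free and can still override it.
evalLayer :: Neuron n => [n] -> [Double] -> [Double]
evalLayer ns xs = map (`fire` xs) ns
```

Implementing a new kind of neuron would then mean writing one instance, with defaults like `evalLayer` (and parallel variants) inherited from the framework.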