And we created a similar network for the AND operator.
To run these examples you would type the following commands:
To build the bytecode (from the encog-examples-3.0.1 directory):
$ ant
To execute (from the lib folder):
$ java -cp encog-core-3.0.1-SNAPSHOT.jar:examples.jar org.encog.examples.neural.xor.XORHelloWorld
$ java -cp encog-core-3.0.1-SNAPSHOT.jar:examples.jar org.encog.examples.neural.and.AND
These examples use a training method known as Resilient Propagation (RPROP). This is a more sophisticated version of backpropagation in which each weight has its own delta -- the step size by which that weight is changed -- instead of a single global learning rate.
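The per-weight delta idea can be sketched in a few lines. The constants and method names below are illustrative, not Encog's API: a delta grows when the gradient keeps its sign across iterations and shrinks when the sign flips.

```java
// Minimal sketch of the RPROP step-size rule (names are illustrative, not
// Encog's API): each weight keeps its own delta, which accelerates while the
// gradient sign is stable and backs off when the sign flips.
public class RpropStep {
    static final double ETA_PLUS = 1.2, ETA_MINUS = 0.5;
    static final double DELTA_MAX = 50.0, DELTA_MIN = 1e-6;

    /** Returns the new delta given the previous and current gradients. */
    public static double updateDelta(double prevGrad, double grad, double delta) {
        double sign = prevGrad * grad;
        if (sign > 0) {
            return Math.min(delta * ETA_PLUS, DELTA_MAX);  // same sign: grow
        } else if (sign < 0) {
            return Math.max(delta * ETA_MINUS, DELTA_MIN); // sign flip: shrink
        }
        return delta; // zero gradient: leave the step size alone
    }

    public static void main(String[] args) {
        double d = 0.1;
        d = updateDelta(1.0, 2.0, d);  // same sign: delta grows
        d = updateDelta(2.0, -1.0, d); // sign flip: delta shrinks
        System.out.println(d);
    }
}
```

Because each weight adapts its own step size, RPROP is much less sensitive to the choice of initial learning rate than plain backpropagation.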
A Hopfield network is an example of a self-connected layer -- i.e., a single-layer network in which the neurons are all connected to one another, but no neuron is connected to itself. Each neuron is a binary threshold unit -- i.e., its activation takes one of two possible values, either (0, 1) or (-1, 1).
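A binary threshold unit with bipolar (-1, +1) activation can be sketched directly (illustrative code, not Encog's API): the neuron outputs +1 when its weighted input sum is non-negative and -1 otherwise.

```java
// Sketch of a bipolar binary threshold unit, the building block of a
// Hopfield network. (Illustrative code, not Encog's implementation.)
public class ThresholdUnit {

    /** Fires +1 if the weighted sum of bipolar inputs is non-negative. */
    public static int activate(double[] weights, int[] inputs) {
        double sum = 0.0;
        for (int i = 0; i < weights.length; i++) {
            sum += weights[i] * inputs[i];
        }
        return sum >= 0 ? 1 : -1;
    }

    public static void main(String[] args) {
        double[] w = {0.5, -0.5, 1.0};
        System.out.println(activate(w, new int[]{1, 1, 1}));   // sum is 1.0, fires +1
        System.out.println(activate(w, new int[]{-1, 1, -1})); // sum is -2.0, fires -1
    }
}
```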
A Hopfield network is guaranteed to converge to a local minimum, but it is not guaranteed to learn every stored pattern. Hopfield networks provide a model for human memory.
Hopfield networks use a form of Hebbian learning -- i.e., the association between two neurons is strengthened the more often the two neurons are activated at the same time. Here's an Encog example.
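Hebbian learning for a Hopfield network is often implemented as an outer-product rule. The sketch below is illustrative, not Encog's implementation: the weight between two neurons grows whenever they are active together in a stored pattern, and the diagonal stays zero because no neuron is connected to itself.

```java
// Hedged sketch of Hebbian (outer-product) training for a Hopfield network
// on bipolar (-1/+1) patterns. (Illustrative code, not Encog's API.)
public class HebbianTrain {

    /** Builds an n-by-n weight matrix from bipolar patterns of length n. */
    public static double[][] train(int[][] patterns, int n) {
        double[][] w = new double[n][n];
        for (int[] p : patterns) {
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    if (i != j) {
                        w[i][j] += p[i] * p[j]; // co-activation strengthens the link
                    }
                }
            }
        }
        return w;
    }

    public static void main(String[] args) {
        double[][] w = train(new int[][]{{1, -1, 1}}, 3);
        // Neurons 0 and 2 were active together, so their link is positive;
        // neurons 0 and 1 disagreed, so their link is negative.
        System.out.println(w[0][2] + " " + w[0][1]);
    }
}
```

Note that the resulting weight matrix is symmetric, which is what guarantees convergence to a local minimum.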
Hopfield networks are good at recognizing patterns, including degraded examples of a learned pattern. In Encog, the patterns can be created with a HopfieldPattern object, which allows you to create patterns as an array of Java strings -- i.e., a grid. The number of neurons corresponds to the number of cells in the grid.
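The grid-of-strings idea can be sketched without Encog: map each marked cell to +1 and everything else to -1, flattening the grid into one bipolar vector with one entry per neuron. The 'O' marker below is an assumption for illustration, not part of Encog's API.

```java
// Converts a grid of Java strings into a flat bipolar pattern, one entry
// per neuron. (The 'O' convention is an illustrative assumption.)
public class GridPattern {

    public static int[] toBipolar(String[] grid) {
        int rows = grid.length, cols = grid[0].length();
        int[] v = new int[rows * cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                v[r * cols + c] = grid[r].charAt(c) == 'O' ? 1 : -1;
            }
        }
        return v;
    }

    public static void main(String[] args) {
        // A 2x2 grid becomes a pattern for a 4-neuron network.
        int[] v = toBipolar(new String[]{"O ", " O"});
        System.out.println(java.util.Arrays.toString(v));
    }
}
```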
Training a Hopfield network involves lowering the energy of the states that the network should remember. In other words, the network is properly trained when the states it should remember are local minima of the network's energy.
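The energy being minimized is the standard Hopfield energy function, E = -1/2 * sum over i,j of w[i][j]*s[i]*s[j] (bias terms omitted here). A small sketch, not Encog's implementation:

```java
// Sketch of the Hopfield energy function with no bias terms: each
// asynchronous neuron update can only lower or keep this energy, which is
// why the network settles into a local minimum.
public class HopfieldEnergy {

    public static double energy(double[][] w, int[] s) {
        double e = 0.0;
        for (int i = 0; i < s.length; i++) {
            for (int j = 0; j < s.length; j++) {
                e += w[i][j] * s[i] * s[j];
            }
        }
        return -0.5 * e;
    }

    public static void main(String[] args) {
        double[][] w = {{0, 1}, {1, 0}};
        // The mutually reinforcing state (1, 1) has lower energy than (1, -1),
        // so the network would settle toward it.
        System.out.println(energy(w, new int[]{1, 1}) + " " + energy(w, new int[]{1, -1}));
    }
}
```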
To execute (from the lib folder):
$ java -cp encog-core-3.0.1-SNAPSHOT.jar:examples.jar org.encog.examples.neural.hopfield.HopfieldAssociate
Exercises: