ResilientBackpropagationLearning Class
Namespace: Accord.Neuro.Learning
The ResilientBackpropagationLearning type exposes the following members.
Constructors

Name | Description
---|---
ResilientBackpropagationLearning | Initializes a new instance of the ResilientBackpropagationLearning class.
Properties

Name | Description
---|---
LearningRate | Learning rate.
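The LearningRate property can be adjusted before training begins. The snippet below is a minimal sketch, assuming a network and a teacher created as in the XOR sample further down this page; the value used is only illustrative, not a recommended default.

```csharp
// create teacher for an existing ActivationNetwork (see the XOR sample below)
ResilientBackpropagationLearning teacher =
    new ResilientBackpropagationLearning( network );

// optionally tune the learning rate before training starts;
// 0.0125 is an arbitrary illustrative value, not a tuned setting
teacher.LearningRate = 0.0125;
```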
Methods

Name | Description
---|---
Equals | Determines whether the specified object is equal to the current object. (Inherited from Object.)
Finalize | Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode | Serves as the default hash function. (Inherited from Object.)
GetType | Gets the Type of the current instance. (Inherited from Object.)
MemberwiseClone | Creates a shallow copy of the current Object. (Inherited from Object.)
Run | Runs a single learning iteration, updating the network from one input/output sample.
RunEpoch | Runs a learning epoch, iterating over the entire set of training samples.
ToString | Returns a string that represents the current object. (Inherited from Object.)
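As a brief sketch of the difference between Run and RunEpoch: Run updates the network from one input/output pair and returns the error for that sample, while RunEpoch passes over the whole training set and returns the summary error for the epoch. The snippet assumes the input, output and teacher variables from the XOR sample further down this page.

```csharp
// single learning iteration on one sample (returns that sample's error)
double sampleError = teacher.Run( input[0], output[0] );

// one full pass over the training set (returns the summary epoch error)
double epochError = teacher.RunEpoch( input, output );
```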
Extension Methods

Name | Description
---|---
HasMethod | Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual | Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To&lt;T&gt; | Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
Remarks

This class implements the resilient backpropagation (RProp) learning algorithm. RProp is one of the fastest learning algorithms for feed-forward neural networks among those that use only first-order (gradient) information, since it adapts each weight's update step using only the sign of the gradient rather than its magnitude.
Sample usage (training a network to calculate the XOR function):

```csharp
// initialize input and output values
double[][] input = new double[4][] {
    new double[] { 0, 0 }, new double[] { 0, 1 },
    new double[] { 1, 0 }, new double[] { 1, 1 }
};
double[][] output = new double[4][] {
    new double[] { 0 }, new double[] { 1 },
    new double[] { 1 }, new double[] { 0 }
};

// create neural network
ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction( 2 ),
    2, // two inputs in the network
    2, // two neurons in the first layer
    1 ); // one neuron in the second layer

// create teacher
ResilientBackpropagationLearning teacher =
    new ResilientBackpropagationLearning( network );

// loop until an externally controlled flag signals training should stop
while ( !needToStop )
{
    // run epoch of learning procedure
    double error = teacher.RunEpoch( input, output );

    // check error value to see if we need to stop
    // ...
}
```
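After training stops, the learned function can be checked by querying the network directly. The sketch below assumes the network variable from the sample above.

```csharp
// query the trained network; an output close to 1 indicates
// that XOR(1, 0) has been learned correctly
double[] result = network.Compute( new double[] { 1, 0 } );
Console.WriteLine( result[0] ); // expected to be close to 1
```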