
ParallelResilientBackpropagationLearning Class

Resilient Backpropagation learning algorithm.
Inheritance Hierarchy
System.Object
  Accord.Neuro.Learning.ParallelResilientBackpropagationLearning

Namespace:  Accord.Neuro.Learning
Assembly:  Accord.Neuro (in Accord.Neuro.dll) Version: 3.8.0
Syntax
public class ParallelResilientBackpropagationLearning : ISupervisedLearning, 
	IDisposable

The ParallelResilientBackpropagationLearning type exposes the following members.

Constructors
Public method ParallelResilientBackpropagationLearning
Initializes a new instance of the ParallelResilientBackpropagationLearning class.
Properties
Public property DecreaseFactor
Gets the decrease parameter, also referred to as eta minus. Default is 0.5.
Public property IncreaseFactor
Gets the increase parameter, also referred to as eta plus. Default is 1.2.
Public property UpdateLowerBound
Gets or sets the minimum possible update step, also referred to as delta min. Default is 1e-6.
Public property UpdateUpperBound
Gets or sets the maximum possible update step, also referred to as delta max. Default is 50.
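For example, the step-size bounds can be tightened before training starts, while the eta parameters can be read back from the teacher. The snippet below is a minimal sketch; the network variable is assumed to be an ActivationNetwork created as in the Examples section.

var teacher = new ParallelResilientBackpropagationLearning(network);

// clamp the per-weight update steps (documented defaults are 1e-6 and 50)
teacher.UpdateLowerBound = 1e-8;
teacher.UpdateUpperBound = 10;

// eta minus and eta plus are exposed as read-only properties
double etaMinus = teacher.DecreaseFactor; // 0.5 by default
double etaPlus = teacher.IncreaseFactor;  // 1.2 by default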
Methods
Public method ComputeError
Compute network error for a given data set.
Public method Dispose
Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
Protected method Dispose(Boolean)
Releases unmanaged and, optionally, managed resources.
Public method Equals
Determines whether the specified object is equal to the current object.
(Inherited from Object.)
Protected method Finalize
Releases unmanaged resources and performs other cleanup operations before the ParallelResilientBackpropagationLearning is reclaimed by garbage collection.
(Overrides Object.Finalize.)
Public method GetHashCode
Serves as the default hash function.
(Inherited from Object.)
Public method GetType
Gets the Type of the current instance.
(Inherited from Object.)
Protected method MemberwiseClone
Creates a shallow copy of the current Object.
(Inherited from Object.)
Public method Reset
Resets the current update steps using the given learning rate.
Public method Run
Runs learning iteration.
Public method RunEpoch
Runs learning epoch.
Public method ToString
Returns a string that represents the current object.
(Inherited from Object.)
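To illustrate how these members fit together: ComputeError evaluates the data-set error without changing the network, Run performs a single-sample learning step, RunEpoch performs a full pass over the data, and Reset re-initializes the adaptive update steps. A minimal sketch, assuming the teacher, input and output variables from the XOR example below:

double errorBefore = teacher.ComputeError(input, output); // measure error only, no weight updates
double sampleError = teacher.Run(input[0], output[0]);    // learn from a single input/output pair
double epochError = teacher.RunEpoch(input, output);      // learn from the whole data set
teacher.Reset(0.0125);                                     // restart the adaptive step sizes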
Extension Methods
Public Extension Method HasMethod
Checks whether an object implements a method with the given name.
(Defined by ExtensionMethods.)
Public Extension Method IsEqual
Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices.
(Defined by Matrix.)
Public Extension Method To(Type)
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Public Extension Method To<T>
Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime.
(Defined by ExtensionMethods.)
Remarks

This class implements the resilient backpropagation (RProp) learning algorithm. RProp is one of the fastest learning algorithms for feed-forward networks that use only first-order (gradient) information.
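In short, RProp maintains an individual update step for each weight and adapts it using only the sign of the error gradient: the step grows by the increase factor while the gradient keeps its sign, shrinks by the decrease factor when the sign flips, and is clamped between the lower and upper bounds listed above. The fragment below is only an illustrative sketch of this rule for a single weight, not the class's actual source:

// illustrative RProp step-size rule for one weight (example gradient values)
double step = 0.0125;          // current individual step size
double previousGradient = 0.3; // dE/dw from the previous iteration
double gradient = -0.2;        // dE/dw from the current iteration

if (previousGradient * gradient > 0)
    step = Math.Min(step * 1.2, 50);   // same sign: grow by the increase factor (eta plus)
else if (previousGradient * gradient < 0)
    step = Math.Max(step * 0.5, 1e-6); // sign flip: shrink by the decrease factor (eta minus)

double weightDelta = -Math.Sign(gradient) * step; // move against the gradient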

Examples

Sample usage (training a network to compute the XOR function):

// initialize input and output values
double[][] input = 
{
    new double[] {0, 0}, new double[] {0, 1},
    new double[] {1, 0}, new double[] {1, 1}
};

double[][] output = 
{
    new double[] {0}, new double[] {1},
    new double[] {1}, new double[] {0}
};

// create neural network
ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction(2),
    2,   // two inputs in the network
    2,   // two neurons in the first layer
    1 ); // one neuron in the second layer

// create teacher
var teacher = new ParallelResilientBackpropagationLearning(network);

// loop
bool needToStop = false;
while (!needToStop)
{
    // run epoch of learning procedure
    double error = teacher.RunEpoch( input, output );
    // check error value to see if we need to stop
    // ...
}
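Once the loop has converged, the trained network can be queried directly to verify the XOR mapping (a minimal check using the same variables):

// each computed answer should approach the corresponding target value
for (int i = 0; i < input.Length; i++)
{
    double[] answer = network.Compute(input[i]);
    // answer[0] should be close to output[i][0]
}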

The following example shows how to use Rprop to solve a multi-class classification problem.

// Suppose we would like to teach a network to recognize 
// the following input vectors into 3 possible classes:
// 
double[][] inputs =
{
    new double[] { 0, 1, 1, 0 }, // 0
    new double[] { 0, 1, 0, 0 }, // 0
    new double[] { 0, 0, 1, 0 }, // 0
    new double[] { 0, 1, 1, 0 }, // 0
    new double[] { 0, 1, 0, 0 }, // 0
    new double[] { 1, 0, 0, 0 }, // 1
    new double[] { 1, 0, 0, 0 }, // 1
    new double[] { 1, 0, 0, 1 }, // 1
    new double[] { 0, 0, 0, 1 }, // 1
    new double[] { 0, 0, 0, 1 }, // 1
    new double[] { 1, 1, 1, 1 }, // 2
    new double[] { 1, 0, 1, 1 }, // 2
    new double[] { 1, 1, 0, 1 }, // 2
    new double[] { 0, 1, 1, 1 }, // 2
    new double[] { 1, 1, 1, 1 }, // 2
};

int[] classes =
{
    0, 0, 0, 0, 0,
    1, 1, 1, 1, 1,
    2, 2, 2, 2, 2,
};

// First we have to convert this problem into a form that the neural
// network can handle. The first step is to expand the classes into
// indicator vectors, where the position holding +1 indicates the
// class that the sample belongs to (all other positions hold -1).
// 
double[][] outputs = Measures.Expand(classes, -1, +1);
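
// For instance, class 0 is expected to expand to { +1, -1, -1 },
// class 1 to { -1, +1, -1 }, and class 2 to { -1, -1, +1 }.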

// Create an activation function for the net
var function = new BipolarSigmoidFunction();

// Create an activation network with the function and
//  4 inputs, 5 hidden neurons and 3 possible outputs:
var network = new ActivationNetwork(function, 4, 5, 3);

// Randomly initialize the network
new NguyenWidrow(network).Randomize();

// Teach the network using parallel Rprop:
var teacher = new ParallelResilientBackpropagationLearning(network);

double error = 1.0;
while (error > 1e-5)
    error = teacher.RunEpoch(inputs, outputs);


// Checks if the network has learned
for (int i = 0; i < inputs.Length; i++)
{
    double[] answer = network.Compute(inputs[i]);

    int expected = classes[i];
    int actual; answer.Max(out actual);

    // actual should be equal to expected
}
See Also
Accord.Neuro.Learning Namespace